Real-Time Ray Tracing Support Comes to GeForce GTX GPUs and Game Engines

The major issue is the denoising. The tensor cores do a really good job at it, but watching the Quake 2 demo he had, he stated the denoising isn't perfect and you still get artifacts from the process.
 
For an enthusiast site it's amazing to see how little people understand about ray tracing, and how quickly they turn to conspiracy theories to convince themselves they are smart for avoiding RTX rather than just admitting the first gen is expensive. As soon as prices normalize with ray tracing, everyone will be screaming about how necessary the RT cores are for performance. Be happy there are early adopters ready to pay the price so that one day the masses will be able to get the same great tech for reduced prices. The butthurt is strong in all these threads.

Be careful: mods deleted my comment yesterday when I brought up all the anti-Nvidia conspiracies flying around these forums with regard to ray tracing.
 
I don't think there are "anti-Nvidia conspiracies", but I do think that there are a number of vocal people out there who have allowed their support for the underdog to override logic and common sense. These people don't just support AMD, but actively root against anything Nvidia does.
And maybe learn to read a little closer before making your attempt at trolling. Nowhere in my post did I use the word their. :rolleyes:
Yeah, I think he must have been addressing FrozenSteel rather than you (but he quoted you instead!)
 
I'd prefer shader compute power, since that improves both traditional games and RTX games. Also, I'm not even sure the RT cores do anything beneficial, since turning on ray tracing tanks performance regardless. But I guess we'll see when we can compare Radeons to GeForces whether more compute is a better alternative.

Doesn't matter, just hand me the popcorn cause this is going to get good.

Isn't the reason why ray tracing is tanking performance that the RT cores are simply not powerful enough?

Or are they not the major bottleneck?
 
Isn't the reason why ray tracing is tanking performance that the RT cores are simply not powerful enough?

Or are they not the major bottleneck?
I'm not sure whether it's because they aren't powerful enough or because the software layer is lagging behind hardware. Hopefully as the software side of things matures we'll see improvements to performance, but to be honest it might take a combination of software improvements and iterative hardware improvements before RT fulfils its potential. If that means waiting for the next generation of RTX cards (or AMD's equivalents, if appropriate) then I'm fine with that - and as I mentioned previously, I'm happy that there are people out there willing to be early adopters for this tech, even if I'm not prepared to do so myself.
 
oh man this is great! I didn't buy into all the hype and they always said patience is a virtue!
now DX12 coming to win7
now ray tracing can work on GTX cards.
I may not need to upgrade after all muahaha!!!
 
Well, that may have sped up the announcement, but I suspect that was always the way it was heading. The real tracing push will come when AMD starts talking about Navi and their console parts. I doubt they have tensor core hardware... but they are going to support ray tracing. People forget AMD was showing off real-time ray tracing with Radeon Rays long before RTX.

IMO Nvidia used tracing to try and sell people on their latest GPUs early, as they know full well that when the real wave of traced games hits, they will use shaders, not tensor cores. I'm sure they will use their "The Way It's Meant to Be Played" money to get some "high end RTX" tracing into a handful of those games. Perhaps it will be faster, perhaps not.

When you have enough compute, sure, but we are nowhere near the point where generalized core structures can calculate ray tracing well enough to do this. Nvidia is using specialized compute cores for that task and tensor cores for denoising. Even the guy that did the RT Quake 2 stated that using general cores for that purpose doesn't do a very good job, with visible artifacts. I specifically like how people ignore wording: Crytek is using "mesh" RT, which is different; it gives the same effect of reflection and refraction without fully calculating everything, aka it's cheating to an extent, but make no mistake, Nvidia is doing the same thing, just a bit closer to the real thing. The bottom line is that if Nvidia hadn't released the RTX 2000 series no one would be talking about it or using it; now that the extensions are in place within the API we will in all likelihood see many different types of implementations.
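As a concrete illustration of what "generalized cores" have to grind through, here is a minimal sketch of the Möller-Trumbore ray/triangle intersection test in C++. The names and structure are purely illustrative, not anyone's actual engine code: a shader-based tracer has to evaluate something like this, plus BVH traversal, for every ray on general ALUs, while Turing's RT cores do the equivalent work in fixed-function hardware.

```cpp
#include <array>
#include <cmath>
#include <optional>

using Vec3 = std::array<float, 3>;

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0]};
}
static float dot(const Vec3& a, const Vec3& b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

// Möller-Trumbore: returns the hit distance t along the ray, or nothing on a miss.
// A compute/pixel shader has to run this (after walking a BVH to find candidate
// triangles) for every ray; RT cores bake both steps into dedicated hardware.
std::optional<float> intersect(const Vec3& orig, const Vec3& dir,
                               const Vec3& v0, const Vec3& v1, const Vec3& v2) {
    const float kEps = 1e-7f;
    const Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    const Vec3 p = cross(dir, e2);
    const float det = dot(e1, p);
    if (std::fabs(det) < kEps) return std::nullopt;  // ray is parallel to the triangle
    const float invDet = 1.0f / det;
    const Vec3 tv = sub(orig, v0);
    const float u = dot(tv, p) * invDet;             // first barycentric coordinate
    if (u < 0.0f || u > 1.0f) return std::nullopt;
    const Vec3 q = cross(tv, e1);
    const float v = dot(dir, q) * invDet;            // second barycentric coordinate
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    const float t = dot(e2, q) * invDet;             // distance along the ray
    return (t > kEps) ? std::optional<float>(t) : std::nullopt;
}
```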
 
Dang, Maxwell got left out of Adaptive Sync and RTX :coffee: It would have been nice to play around with some basic lighting demos. Although it may still work with CryEngine. Maybe Nvidia is trying to tell me something.

Now get off my virtual rasterized lawn!
 
When you have enough compute, sure, but we are nowhere near the point where generalized core structures can calculate ray tracing well enough to do this. Nvidia is using specialized compute cores for that task and tensor cores for denoising. Even the guy that did the RT Quake 2 stated that using general cores for that purpose doesn't do a very good job, with visible artifacts. I specifically like how people ignore wording: Crytek is using "mesh" RT, which is different; it gives the same effect of reflection and refraction without fully calculating everything, aka it's cheating to an extent, but make no mistake, Nvidia is doing the same thing, just a bit closer to the real thing. The bottom line is that if Nvidia hadn't released the RTX 2000 series no one would be talking about it or using it; now that the extensions are in place within the API we will in all likelihood see many different types of implementations.

Nvidia marketing at its finest. Nvidia has done a whole lot of nothing, except code their tensor cores to the task and claim they were the ones driving this in games. I mean, come on folks, how many games were ready at RTX launch? How many games have you seen so far purpose-built to use ray tracing elements?

Of course the answer is none and none. Games using traced elements for real are coming; they are being developed to launch with the PS5 and next-gen Xbox.

The Nvidia boosters seem to forget about PowerVR and AMD talking about this tech for years now. Except PowerVR made it mobile... and AMD made it free. Of course, whatever we get in terms of rasterized tracing is going to be a fake and not true pure ray tracing. We are still years away from real-time ray tracing, if it's ever actually realistic.
 
Nvidia marketing at its finest. Nvidia has done a whole lot of nothing, except code their tensor cores to the task and claim they were the ones driving this in games. I mean, come on folks, how many games were ready at RTX launch? How many games have you seen so far purpose-built to use ray tracing elements?

Of course the answer is none and none. Games using traced elements for real are coming; they are being developed to launch with the PS5 and next-gen Xbox.

The Nvidia boosters seem to forget about PowerVR and AMD talking about this tech for years now. Except PowerVR made it mobile... and AMD made it free. Of course, whatever we get in terms of rasterized tracing is going to be a fake and not true pure ray tracing. We are still years away from real-time ray tracing, if it's ever actually realistic.


https://media.discordapp.net/attach...2128/RadeonVII_DXR_2.PNG?width=826&height=488

https://media.discordapp.net/attach...31/200_Ti_raytracing.PNG?width=830&height=487

Ray tracing has been around for a while, but keep this in mind: a generalized generic core can't beat a specialized one. Nvidia, whether you guys like it or not, attached workstation-level cores specialized for ray tracing and denoising to the die. I get it, the price is too high for most people, but suck it up, buttercup. Consoles, for all intents and purposes, are entry-level to lower mid-range chips. If you think for one second that the equivalent of a budget RX 570 could handle mass compute, you are completely insane. Could it cheat? Sure, by reducing overhead: not casting too many rays, or calculating a near-field path instead of the full one, or even rendering reflections at a quarter of the resolution to cheat out enough fps. But don't get delusional. Also, you are aware Metro Exodus is using real RT for global illumination and it works just fine; Battlefield did not use it correctly, and SotTR can't get it running properly.
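To put rough numbers on the quarter-resolution cheat mentioned above, here is a back-of-the-envelope ray budget in C++. The figures, 1440p at 60 fps with one reflection ray per pixel, are illustrative assumptions, not measurements from any game.

```cpp
#include <cstdio>

// Illustrative ray-budget arithmetic only; real renderers cast a varying
// number of rays per pixel and per effect.
int main() {
    const long long w = 2560, h = 1440, fps = 60;
    const long long fullRes = w * h * fps;                // one reflection ray per pixel
    const long long quarterRes = (w / 2) * (h / 2) * fps; // half res per axis = 1/4 the rays
    std::printf("full res:    %lld rays/s\n", fullRes);    // ~221 million rays/s
    std::printf("quarter res: %lld rays/s\n", quarterRes); // ~55 million rays/s
}
```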
 
Nvidia marketing at its finest. Nvidia has done a whole lot of nothing, except code their tensor cores to the task and claim they were the ones driving this in games. I mean, come on folks, how many games were ready at RTX launch? How many games have you seen so far purpose-built to use ray tracing elements?

Of course the answer is none and none. Games using traced elements for real are coming; they are being developed to launch with the PS5 and next-gen Xbox.

The Nvidia boosters seem to forget about PowerVR and AMD talking about this tech for years now. Except PowerVR made it mobile... and AMD made it free. Of course, whatever we get in terms of rasterized tracing is going to be a fake and not true pure ray tracing. We are still years away from real-time ray tracing, if it's ever actually realistic.


Yeah, but that isn't in real time; that is rendering something in a week or more there, bud. People have been using RT in photo and video production for a real long while now, Pixar in particular... so? Point: this isn't in real time. Nvidia offers workstation cards that do it in real time for something like $16k, where that used to be a $72k workstation cluster, and a 2080 Ti does this in a few hours, apparently.
 
I'm seeing a lot of talk about Raytracing and Tensor Cores... Just to clarify, their two separate things. Tensor cores have nothing to do with Raytracing, they have everything to do with Nvidia DLSS (Deep Learning Super Sampling). Raytracing on RTX cards is offloaded to RT cores rather than running in the shaders. So in reality, Raytracing should be faster on RTX cards so long as the RT cores are utilized properly. Can Raytracing be done on shaders? Sure, but at the loss of performance as some shaders need to be consumed to perform the Raytracing operations. See the photo below which shows the difference. Note that the Tensor cores are separated from the RT cores.

View attachment 149035
I'm seeing a lot of talk about Raytracing and Tensor Cores... Just to clarify, their two separate things. Tensor cores have nothing to do with Raytracing, they have everything to do with Nvidia DLSS (Deep Learning Super Sampling). Raytracing on RTX cards is offloaded to RT cores rather than running in the shaders. So in reality, Raytracing should be faster on RTX cards so long as the RT cores are utilized properly. Can Raytracing be done on shaders? Sure, but at the loss of performance as some shaders need to be consumed to perform the Raytracing operations. See the photo below which shows the difference. Note that the Tensor cores are separated from the RT cores.

View attachment 149035
I find it's usually the same people that don't know how to use there, their and they're properly who post opinions on things without doing their homework (if you had done yours in school, you'd know how to use those words). If you're going to get specific about tech in your posts, do a basic Google search first before misinforming those who might be reading your garbage. This site seems to be a honeypot for anyone anti-Nvidia these days and it's getting a bit ridiculous.
 
And maybe learn to read a little closer before making your attempt at trolling. Nowhere in my post did I use the word their. :rolleyes:


My apologies... I was commenting on the post you were commenting on, from FrozenSteel. I've since replied directly to him/her (see the post prior to this one).

Exciting times ahead, and the companies that innovate are not the ones we should be bitching about.
Ya... Nvidia released some cards that were expensive and didn't perform perfectly when enabling RTX, but that's pretty standard stuff when it comes to introducing any new game-changing tech in hardware, and this generation of spoiled brats needs to calm down and appreciate what has been done for them.

I've been reading Hardocp for decades. The only thing that makes any sense WRT the tone on this site these days is that the complainers are all invested in AMD ;)
 
I'm seeing a lot of talk about Raytracing and Tensor Cores... Just to clarify, their two separate things. Tensor cores have nothing to do with Raytracing, they have everything to do with Nvidia DLSS (Deep Learning Super Sampling). Raytracing on RTX cards is offloaded to RT cores rather than running in the shaders. So in reality, Raytracing should be faster on RTX cards so long as the RT cores are utilized properly. Can Raytracing be done on shaders? Sure, but at the loss of performance as some shaders need to be consumed to perform the Raytracing operations. See the photo below which shows the difference. Note that the Tensor cores are separated from the RT cores.

Marketing slides aside, RT cores are only a dedicated pipeline in the SM, with optimized ALUs/logic and register space/local cache for ray/triangle intersection and BVH traversal in parallel with shader ops. The question is: more general-purpose "shaders", or additional customized "shaders" at the expense of general ALUs? NV's approach is custom logic, from transcendentals to matrix math, etc. That has a cost, in die area and the hip pocket.

You go out of your way to highlight and underline that tensor cores have nothing to do with ray tracing. Maybe do a few seconds of research, as they are used for denoising the ray-traced image.

Got 'em on board anyway, may as well use them for something. ;) With the low number of rays, denoising is a must, so that's a good use of tensor math when it comes to GI/shadows. We're a long way from full-scene RTRT, and that introduces as many problems as it solves. Actually, tensor cores are also used on RTX Turing for fp16. GTX Turing has "cut down tensor cores" added to the SM core for fp16.
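A toy illustration of why denoising is mandatory at such low ray counts: a bare-bones temporal accumulation blend in C++. This is only the general idea; real denoisers, tensor-core or otherwise, add reprojection, variance clamping, and spatial filtering on top.

```cpp
#include <cstddef>
#include <vector>

// Toy temporal accumulation: blend this frame's noisy low-sample-count
// ray-traced buffer into a running history buffer. A smaller alpha averages
// more frames (less noise) at the cost of ghosting under motion.
// Purely illustrative; not how NVIDIA's learned denoisers actually work.
void accumulate(std::vector<float>& history, const std::vector<float>& noisy,
                float alpha = 0.1f) {
    for (std::size_t i = 0; i < history.size(); ++i)
        history[i] = (1.0f - alpha) * history[i] + alpha * noisy[i];
}
```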
 
NVIDIA HURT ITSELF IN ITS CONFUSION

LOL, not really. They already stated any GPU can ray trace, but it will suck at it due to lack of actual compute, hence why they added specialized core structures to the die (a generalized core will NEVER outperform something specialized for a very specific task; people spout nonsense all the time about unified shaders, but the reality is you lose overhead because of it). People will run RT on GTX cards to get a taste and wish they had RTX cards. When games start coming through the pipeline, it usually takes a couple of years, like tessellation did, for it to become actually usable.
 
Is this the video that went with this story?


Also, it would be great if Nvidia would let us see some benchmarks on something like a GTX 1070, etc. Based on the fact that the Crytek demo seemed to run decently on that Vega 56, it's promising. The older traced Quake 2 version (http://amietia.com/q2pt.html) actually ran on a GTX 1080 at a good rate; it really has that next-gen feel to it. The new update to it is amazing with the shadows, etc.

Nice to finally see some ray tracing; it's been teased for ages.
 
Has anyone loaded up the new driver yet and actually tried this?
I think the driver was released earlier this week.
Driver is 425.31

I saw on Guru3D that Metro Exodus FPS drops from the 50s in DX12 to 25 or so with the RTX effects enabled.

Apparently no DLSS, makes sense.
 
My 1080 Ti ran them with ray tracing. Like it says, no DLSS. The Justice demo allows you to switch RTX on/off. It's a slideshow at 4K, but 1080p is almost liveable.
 
oh man this is great! I didn't buy into all the hype and they always said patience is a virtue!
now DX12 coming to win7
now ray tracing can work on GTX cards.
I may not need to upgrade after all muahaha!!!

Lol... well, I've got the new drivers installed, the Atomic Heart demo, the Quake 2 raytraced exe, the Star Wars Reflections demo, and run a 1080 Ti FTW.

Atomic Heart: 5 to 15 fps, but it does look great.
Star Wars Reflections: about 10fps
Quake 2: 4 to 8 fps at my native display resolution of 3440x1440.

So if you've got a Radeon card and a driver that supports DXR, you can expect half to three-quarters of the numbers I am seeing, maybe the same or a bit better with a Radeon VII. Nothing playable. Well, at 1080p maybe I could get 20 to 30 fps... I'll try it.
 
My 1080 Ti ran them with ray tracing. Like it says, no DLSS. The Justice demo allows you to switch RTX on/off. It's a slideshow at 4K, but 1080p is almost liveable.

When I tried to run the Justice demo at 4K it wouldn't start. I got stuck at the startup window with the option to cancel being the only thing not greyed out.

I haven't tried yet, but someone over at Guru3D told me they had to change the resolution of the monitor to get it to work.
 
So if you've got a Radeon card and a driver that supports DXR, you can expect half to three-quarters of the numbers I am seeing, maybe the same or a bit better with a Radeon VII. Nothing playable. Well, at 1080p maybe I could get 20 to 30 fps... I'll try it.

From a strategic perspective, I don't think AMD will enable DXR in drivers until they have a card that can do DXR with decent performance.

If you can't do it well, it's better to downplay DXR and not do it at all until you have at least one product that can do it well.
 
Re-ran Atomic Heart after remembering that EVGA Precision has an FPS display. At the 1080p resolution option, DLSS off of course, I saw between 11 and 45 fps; the average was around 24-ish on my 1080 Ti.
 
RT on my 1070 Ti is ghastly and awful.

My 2080 Ti is, well....

I have a feeling that nV allowed RT on non-RTX cards as a marketing and PR stunt to bring awareness to just how intense the compute is on ray tracing.

I thoroughly enjoy ray tracing. It adds another dimension of realism to software. I hope it continues to be developed and pushed, and I feel it will. And my 2080 Ti is doing a great job of ray tracing.

I hope Navi has RT hardware as well.
 
Not quite the thread for this, but while doing some more RT and DLSS testing this weekend I noticed that either Metro or Tomb Raider, I don't remember which (probably Metro), actually let me use DLSS without using RT. For those with a 2070 or 2080 this could be a means of gaining frames at 4K. I did notice an FPS increase when I did it. It could probably help a 2060 reach impressive levels at 1440p.
 
Not quite the thread for this, but while doing some more RT and DLSS testing this weekend I noticed that either Metro or Tomb Raider, I don't remember which (probably Metro), actually let me use DLSS without using RT. For those with a 2070 or 2080 this could be a means of gaining frames at 4K. I did notice an FPS increase when I did it. It could probably help a 2060 reach impressive levels at 1440p.

Indeed. Anthem doesn't offer RT but does do DLSS, which makes it totally enjoyable at 4K with my 2070.

It's neat.
 