Ray Tracing - Game Changer or Overhyped?

  • Yes - Game Changer for sure
    Votes: 118 (46.6%)
  • No - Overhyped, not what was intended
    Votes: 135 (53.4%)

  Total voters: 253
I don't think anyone is really questioning NV's focus on Raytracing. It's the omission of the other metrics that is suspect; that's what M76 was getting at.

RTX is all exciting until you realize very few games will use it in the next 12 months (AKA the main timeframe when Nvidia will sell 20 series before moving on to what's next), and only for certain effects at that. Therefore, rasterization performance is still a much more important topic until raytracing becomes more prevalent. Of course RTX is the new shiny stuff, but people are right to doubt and criticize NV for not sharing the values that actually matter for now: raster performance. I'm sure in the next 10 years the raster/raytrace balance will shift, and raytracing performance then could and should take precedence.

In the meantime, it's not very relevant to actual game performance = habitual NV shadiness. I'll be glad to see the 20 series kick butt, but even then, you can't ignore that it'll be a short-lived generation. 7nm is in production and around the corner, and that's where the real benefits lie.

Spot on. I'd also add that in BF V, for example, the raytracing results look impressive on their own but to me don't really blend in well with the rest of the game; it's like they are too in-your-face compared to the more traditional techniques used for the rest of the visuals. They may be closer to reality, but that's how they come across to me in game. By comparison, Tomb Raider seems to use raytracing mostly for shadows, and to me those look more like stuff you will miss while actually playing the game. I feel like raytracing has to handle more of a scene to be truly better than the very good fakery we are used to.

I'm also a bit concerned about where they can go in terms of performance. 7nm is already pretty damn small. I imagine they would start hitting bigger barriers with rasterization if they had not introduced the raytracing aspect. At this point raytracing takes us back to something like GTX 780 performance where it can handle 1080p pretty well but beyond that starts to struggle. I'm sure some of that can be alleviated with optimizations as developers get more experience with it, sort of like going from PS4 release titles to something like Horizon or God of War.
 
Real 7nm or PR 7 nm?

I mean, sure, but let's not pretend like cutting die size in half will have no tangible impact:

[Attached image: upload_2018-9-2_12-22-1.png]

Of course that's PR speak, but you don't announce a 60% drop in power (from 16nm, not 14nm) and then deliver 10%. There is little doubt that die size will see an area reduction of around 70%, which we know has been the main method to get sizable performance/power increases for the past three decades. There's a very clear reason the GTX 1080 uses 180W and the 2080 uses 215W: the barely-there change in process from 16nm to 12nm. Likewise for the RX 500 series vs. the 400 series: more performance at more power on the same/similar lithography.

So, PR spin or not, 7nm can stand on its own and given its timeline, should be available in products before next summer with a notable effect on performance/power.
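Quick sanity check on that area figure (back-of-the-envelope only, not a vendor number): if node names were literal feature sizes, area would scale with the square of the linear dimension, so going from 16nm to 7nm gives (7/16)² ≈ 0.19 - an ideal shrink of roughly 80%. Node names are partly marketing, so the real density gain comes in below that, which is why ~70% area reduction is a plausible ballpark rather than an optimistic one.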

in BF V, for example, the raytracing results look impressive on their own but to me don't really blend in well with the rest of the game

Agreed, I also saw more potential than actual improvement. Those windows and glass looked fantastic, but then the rest of the building was incongruent with their realism - it kind of threw me off. Probably because this is a) raytracing tacked onto a mostly finished game and b) let's say 10% raytrace, 90% raster. To really see the jump in IQ we want, I'd guess we'll need at least a 50-50 balance for both rendering techniques to blend credibly.
 
I think of it like when the first DirectX 12 GPUs came out, not many games supported it, and some of the early games saw performance loss. But DX12 was still a step forward, it just takes time for developers to learn a new API and how to optimize for it. Additionally, it also takes time for the market to grow to the point of making a DX12 only game viable, which there are still few. And you can't have a market unless GPU makers start rolling out new cards, so you have to start somewhere.
 
I think of it like when the first DirectX 12 GPUs came out, not many games supported it, and some of the early games saw performance loss. But DX12 was still a step forward, it just takes time for developers to learn a new API and how to optimize for it. Additionally, it also takes time for the market to grow to the point of making a DX12 only game viable, which there are still few. And you can't have a market unless GPU makers start rolling out new cards, so you have to start somewhere.
I thought we were still waiting for mass DX12 adoption?
 
I thought we were still waiting for mass DX12 adoption?

Well, it's been adopted into all of the engines. Obviously games take a little time, but based on what we've seen so far, adoption is going very well, and 'mass adoption', however pessimistically you define it, is occurring as we speak.
 
I thought we were still waiting for mass DX12 adoption?
Yes, we are. At least Vulkan has had some wins (with DOOM, for example) and that is very similar in capabilities to DX12. So we know there are advantages if done right. But developers may be hesitant to go all in, for example if they want to support Windows 7 (still a large market) or people with older GPUs. But things are moving in that direction; for example, the new WoW expansion is using DX12 (with a DX11 option), and some Microsoft games have been DX12 exclusive. It takes time, though, that was my point.
 
At least Vulkan has had some wins (with DOOM, for example) and that is very similar in capabilities to DX12.

As much as I absolutely adore the work that id has done in making Doom an outstandingly efficient and responsive shooter, I do have to point out that it's more of the exception than the rule, at least unless and until Bethesda decides to leverage that technology for their other IP families like Fallout and Elder Scrolls.

In which case, fuck yeah!
 
Yes, we are. At least Vulkan has had some wins (with DOOM, for example) and that is very similar in capabilities to DX12. So we know there are advantages if done right. But developers may be hesitant to go all in, for example if they want to support Windows 7 (still a large market) or people with older GPUs. But things are moving in that direction; for example, the new WoW expansion is using DX12 (with a DX11 option), and some Microsoft games have been DX12 exclusive. It takes time, though, that was my point.
Seems like Vulkan would be better since it's open? But MS can't control that and make money or pay somebody money, so probably harder.
 
Seems like Vulkan would be better since it's open?

Better is always relative...

DX12 is close to what's used on the Xbox, one of the leading console platforms, which alongside Windows represents a majority of non-phone game sales and is thus the main development target.

Now, I'd prefer Vulkan myself as virtualization has advanced to the point that fluidity of host operating system is a thing, but as noted above we're not quite there yet.
 
Better is always relative...

DX12 is close to what's used on the Xbox, one of the leading console platforms, which alongside Windows represents a majority of non-phone game sales and is thus the main development target.

Now, I'd prefer Vulkan myself as virtualization has advanced to the point that fluidity of host operating system is a thing, but as noted above we're not quite there yet.

Heh Xbox is basically dead. The PS4 is so far ahead it's a joke. Microsoft might as well throw in the towel at this point.

The problem with consoles is that their "value" is in how awful they make their competition; not how good they actually are themselves. Consoles are only as good as their exclusives. Sony got all of them this generation. So Sony became better by making their competition worse.

It's why console gaming is sick and needs to die.
 
Yes, we are. At least Vulkan has had some wins (with DOOM, for example) and that is very similar in capabilities as DX12. So we know there are advantages if done right. But developers may be hesitant to go all in, for example if they want to support Windows 7 (still a large market) or people with older GPUs. But things are moving in that direction, for example the new WoW expansion is using DX12 (with a DX11 option), and some Microsoft games have been DX12 exclusive. It takes time, though, that was my point.

Not a major win since it's not much different than just running it with OpenGL. For actual wins they need to be able to demonstrate compelling differences, and neither D3D12 nor Vulkan has managed to do that yet.
 
Heh Xbox is basically dead. The PS4 is so far ahead it's a joke. Microsoft might as well throw in the towel at this point.

The problem with consoles is that their "value" is in how awful they make their competition; not how good they actually are themselves. Consoles are only as good as their exclusives. Sony got all of them this generation. So Sony became better by making their competition worse.

It's why console gaming is sick and needs to die.

You have market figures and quotes from executives and hell, even press releases to back that up?
 
if Bethesda decides to leverage that technology for their other IP families like Fallout and Elder Scrolls.

In which case, fuck yeah!

Yes, it's an exception, but it proves that with the right optimization it can be done.

It still surprises me how well Doom runs on my simple 1060, or my previous RX 470 for that matter. I wish more games leveraged Vulkan, the uplift is quite tangible. I'm hoping the next Elder Scrolls will follow this route.
 
I wish more games leveraged Vulkan, the uplift is quite tangible. I'm hoping the next Elder Scrolls will follow this route.

The reason I point out Doom as an exception is because its performance simply cannot be used as an example that could be applied to other games. And damn do I wish it could, but them's the breaks.

Even an Elder Scrolls or Fallout title I'd be suspicious of until I got my hands on it. How Doom 'feels' isn't something that I trust others to properly report on, let alone measure ;).
 
Not a major win since it's not much different than just running it with OpenGL. For actual wins they need to be able to demonstrate compelling differences, and neither D3D12 nor Vulkan has managed to do that yet.
DOOM had a nearly 40% fps boost on certain systems when running Vulkan, particularly on AMD GPUs and machines with weaker CPUs.
 
DOOM had a nearly 40% fps boost on certain systems when running Vulkan, particularly on AMD GPUs and machines with weaker CPUs.

Yeah, and the overwhelming majority of gamers own Nvidia cards, so as I said, no major wins. No one's impressed by framerate anyway. You need actual visual distinctions you can show off to be compelling. Most gamers out there can't even tell the difference between 30fps and 60fps.
 
Spot on. I'd also add that in BF V, for example, the raytracing results look impressive on their own but to me don't really blend in well with the rest of the game; it's like they are too in-your-face compared to the more traditional techniques used for the rest of the visuals. They may be closer to reality, but that's how they come across to me in game. By comparison, Tomb Raider seems to use raytracing mostly for shadows, and to me those look more like stuff you will miss while actually playing the game. I feel like raytracing has to handle more of a scene to be truly better than the very good fakery we are used to.

I'm also a bit concerned about where they can go in terms of performance. 7nm is already pretty damn small. I imagine they would start hitting bigger barriers with rasterization if they had not introduced the raytracing aspect. At this point raytracing takes us back to something like GTX 780 performance where it can handle 1080p pretty well but beyond that starts to struggle. I'm sure some of that can be alleviated with optimizations as developers get more experience with it, sort of like going from PS4 release titles to something like Horizon or God of War.

Yeah we’re definitely in the ‘let’s make it really obvious’ phase so puddles of water and glossy floors look like perfect mirrors which of course is ridiculous. The interplay of light and shadow in real life is a lot more subtle and you can see the same in offline raytraced renders. That’s where we need to get to in games.

The thing is, raytracing itself is a relatively simple algorithm. It’s up to the devs to define material parameters and write the shaders that actually compute color when a ray hits something. The current issues with BFV are 100% due to artistic direction and material setup. Those are things that they can experiment with and improve now that they have an actual RT solution to work with. Hopefully the final product is a bit more polished :D
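To make the "simple algorithm, materials do the rest" point concrete, here's a toy sketch (purely illustrative names and numbers, nothing from Frostbite or any shipping engine): the tracer's only job is to find the nearest hit, and whether a surface reads as a mirror puddle or a subtle rough reflection is entirely down to the material parameters and the code that runs on a hit.

Code:
// Minimal toy sketch: find the nearest hit, then let the material decide what
// the hit looks like. All names/values are illustrative, not engine code.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray    { Vec3 origin, dir; };   // dir assumed normalized
struct Sphere { Vec3 center; float radius; };

// The artist-facing knob: roughness 0 = perfect mirror (the BFV puddles),
// higher roughness = blurrier, more lifelike reflections.
struct Material { float roughness; };

// Classic ray/sphere intersection: solve |o + t*d - c|^2 = r^2 for t.
bool intersect(const Ray& r, const Sphere& s, float& t) {
    Vec3 oc = r.origin - s.center;
    float b = dot(oc, r.dir);
    float c = dot(oc, oc) - s.radius * s.radius;
    float disc = b * b - c;
    if (disc < 0.0f) return false;     // ray misses the sphere
    t = -b - std::sqrt(disc);
    return t > 0.0f;
}

// Stand-in for a hit shader: a real one would fire secondary rays and
// evaluate a BRDF; here we just report how mirror-like the bounce would be.
float mirrorness(const Material& m) { return 1.0f - m.roughness; }

int main() {
    Sphere puddle{{0.0f, 0.0f, 5.0f}, 1.0f};
    Ray ray{{0.0f, 0.0f, 0.0f}, {0.0f, 0.0f, 1.0f}};
    float t;
    if (intersect(ray, puddle, t))
        std::printf("hit at t=%.2f, mirror-ness=%.1f\n", t, mirrorness({0.0f}));
    return 0;
}

Point being, the "everything is a perfect mirror" look is a tuning choice in that material/shading layer, not something baked into the tracing itself.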
 
I want to vote for both. I think the RTX implementation of ray tracing is overhyped and doesn't add that much, but if we ever get full real-time ray tracing it will be freaking amazing. It's possible that the RTX's raytracing has possibilities, but I think we need to wait for a generation or two to see how they develop. I think it also matters whether AMD develops something comparable and it becomes a must-have for games, or if it stays one of those Nvidia-only features that don't get a ton of implementation and development.
 
I think the RTX implementation of ray tracing is overhyped and doesn't add that much, but if we ever get full real-time ray tracing it will be freaking amazing.

It's only limited due to performance and nascency of implementation. As developers iterate the best way to use it, performance will go up.

It's possible that the RTX's raytracing has possibilities, but I think we need to wait for a generation or two to see how they develop.

Well, that's going to happen. It's also functional now, already in the engines, and you'll be able to run it yourself very soon (before the holiday season).

I think it also matters whether AMD develops something comparable and it becomes a must-have for games, or if it stays one of those Nvidia-only features that don't get a ton of implementation and development.

Given that ray tracing has always been an end goal for gaming (and a mainstay for all other types of sub-realtime rendering), it's not comparable to vendor one-off features that both major vendors are guilty of. 'RTX' happens to be how Nvidia is marketing their inclusion of the technology into their products, but the base idea is older than the rasterization process that current games and graphics hardware use.

Now, while I have no doubt that AMD will implement ray tracing- probably calling it something else while supporting the same DX and Vulkan extensions in hardware- there are real questions with respect to their ability to produce competitive hardware at all. They're plenty capable of designing competitive (if lagging) hardware, but they're also currently two generations behind.

Intel may prove more competitive in this space especially when games and other applications start leaning more heavily or even completely on ray tracing for real-time rendering.
 
I think the RTX implementation of ray tracing is overhyped and doesn't add that much, but if we ever get full real-time ray tracing it will be freaking amazing.
RTX *is* full ray tracing. It's just that developers have to learn how to use it, and currently most of what we have seen uses hybrid approaches where ray-tracing is used for just reflections, or just shadows, etc., but the hardware is there for developers to leverage.
 
RTX *is* full ray tracing. It's just that developers have to learn how to use it, and currently most of what we have seen uses hybrid approaches where ray-tracing is used for just reflections, or just shadows, etc., but the hardware is there for developers to leverage.

I'll add that it's surprising that we've arrived at a 'complete' solution from the outset; not only is ray tracing fully implemented, but it is running concurrently with and complementing rasterization.
 
Keep in mind that RTX is not just an "Nvidia thing". RTX is just what they call their implementation of DXR, which is a DirectX feature made by Microsoft in collaboration with both NV and AMD. It's likely there'll be Vulkan extensions, but DXR is mainly DirectX based - the name itself says it, DirectX Raytracing. It is not a question of whether AMD will make their own implementation (probably under the more marketable name of their Radeon Rays); they collaborated to make it, they know it's coming (and knew NV's hardware was coming before it was announced).

The next Radeon GPUs definitely will have some sort of DXR processing engine. Whether they're faster or slower than Nvidia remains to be seen (I'm expecting the latter, but who knows). Just a few days ago WCCFTech (I know, garbage source, but still) posted a snippet with a BFV developer saying they only talk to DXR, and then Nvidia's cards deal with those instructions through RTX. There are some dependencies (aka specific fixes for NV hardware) that won't work with AMD cards, but once they release their DXR-capable Radeons the developers can apply similar optimizations for AMD hardware.

This is all the more exciting when you take Intel into account, as IdiotInCharge said (sorry, I can't figure out how to "tag" usernames within the post). I'd assume they were aware DXR was coming, but even if they weren't, it's now going to be pretty much required for them to have an engine to accelerate this feature. 3 players pushing the boundaries of raytracing means we'll for sure have DXR processing capabilities that are breathtaking compared to what we have now with Turing/RTX gen1.
 
euskalzabe The Vulkan ray-tracing extension is "coming soon" according to this post: https://github.com/KhronosGroup/Vulkan-Docs/issues/686#issuecomment-414798835

So, ray-tracing will be available both in DirectX 12 and Vulkan (shortly), not your typical proprietary feature. It's just a matter of AMD figuring out how to support it and I guess Intel too (if they are really showing up on the high end).
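For what it's worth, once the extension ships, detecting it at runtime is just the standard Vulkan device-extension query - minimal sketch below. The string "VK_NV_ray_tracing" is my stand-in for whatever the final extension ends up being named, so treat that name as an assumption; the query calls themselves are plain Vulkan 1.0.

Code:
// Sketch: enumerate devices and check for a ray-tracing device extension.
// "VK_NV_ray_tracing" is a placeholder name; the enumeration mechanism is
// standard Vulkan and works for any extension string.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
        std::puts("No Vulkan instance available");
        return 1;
    }

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool hasRT = false;
        for (const VkExtensionProperties& e : exts)
            if (std::strcmp(e.extensionName, "VK_NV_ray_tracing") == 0)
                hasRT = true;

        std::printf("GPU %p: ray-tracing extension %s\n",
                    (void*)gpu, hasRT ? "present" : "absent");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}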

If you want to tag someone, type "@" and then start typing their name.
 
cybereality AH! I learned to tag people in the [H]! Thanks bud.

Nice on the Vulkan stuff. I don't think AMD has to figure out how to support it; they collaborated to make DXR, so they already know how. The issue will be whether they can manufacture something good enough to process it efficiently.
 
Right, but I doubt it can be retro-fitted on existing architecture, and we don't know what kind of lead time they need. Meaning, if the next 1 or 2 generations of AMD GPUs are already in the pipeline, how flexible are they to add a new design this late in the game?
 
Right, but I doubt it can be retro-fitted on existing architecture, and we don't know what kind of lead time they need. Meaning, if the next 1 or 2 generations of AMD GPUs are already in the pipeline, how flexible are they to add a new design this late in the game?

If they have any relationship with Microsoft they would have heard it through the DirectX team indirectly in advance. Their fault if they didn’t.
 
Correct. What I'm saying is that Nvidia has been working on this for 10 years, and GPU architectures can take years to develop. I'm sure they had advance notice but, even still, it's not a quick thing. I hope they come through, of course.
 
Eh, I'd bet that AMD has it in the pipeline.

Probably one of the easiest features to support.
 
If they have any relationship with Microsoft they would have heard it through the DirectX team indirectly in advance. Their fault if they didn’t.

I just said AMD collaborated with MSFT just like NV to create DXR, they already knew it was coming because they helped create it...

Right, but I doubt it can be retro-fitted on existing architecture, and we don't know what kind of lead time they need. Meaning, if the next 1 or 2 generations of AMD GPUs are already in the pipeline, how flexible are they to add a new design this late in the game?

It already has been retrofitted. Not sure that's a good word, because it's just software that can run on shaders if hardware is not present. Microsoft explained this in their blog post announcement, where at the end they specify:

"What Hardware Will DXR Run On?
Developers can use currently in-market hardware to get started on DirectX Raytracing. There is also a fallback layer which will allow developers to start experimenting with DirectX Raytracing that does not require any specific hardware support. For hardware roadmap support for DirectX Raytracing, please contact hardware vendors directly for further details."
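For illustration, this is roughly what that check looks like from the application side - a minimal sketch, not code from the blog post. It assumes a Windows 10 SDK recent enough to expose the OPTIONS5 feature struct and linking against d3d12.lib:

Code:
// Sketch: ask D3D12 whether DXR has a native (hardware/driver) path.
// If not, a title either skips the RT effects or uses the fallback layer.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    // OPTIONS5 is where the runtime reports the raytracing tier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    bool native =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::puts(native ? "DXR: native hardware/driver path available"
                     : "DXR: no native support, fallback layer only");

    device->Release();
    return 0;
}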
 
Yeah, DXR has a fallback which will work with AMD cards (or non-RTX Nvidia cards) but it's supposedly super slow and not usable for release.
 
So in essence...

Only nVidia has working hardware out... and only in a first, not-so-fast edition.
AMD and Intel do not...
The consoles do not.

So... this will get adopted by developers even more slowly than DX12 (unless nVidia pays them to add it as an afterthought).

But I'm probably still gonna buy a 2080 Ti if it's +40% faster than the 1080 Ti, just to get the performance.
 
Yea, and we will need a 60-month loan to buy one.
Technology gets cheaper after the first generation, not more expensive.
Except you didn't have to go back to 320x200 to enjoy HW T&L on it. T&L was a game changer then and there, not 3-5 years later. At this point this seems like a distraction. From what, you may ask? We'll find out soon when independent benchmarks on public drivers start to come in.
T&L had its share of controversy and skeptics, just as RT does now. Sure, you didn't have to decrease the resolution, but it wasn't any faster than processors coming out with SSE and 3DNow! extensions at the time. The GeForce 256 was a few years too late to make a dramatic impact. RT is a few years too soon.

https://www.hardocp.com/article/2000/01/25/tnl_does_work_day_2
https://www.hardocp.com/article/2000/02/04/tnl_does_work_day_3
https://www.hardocp.com/article/2000/02/09/tnl_shines
 
Yeah, DXR has a fallback which will work with AMD cards (or non-RTX Nvidia cards) but it's supposedly super slow and not usable for release.

Indeed, but to be fair, it's the only way to access it if GPUs don't have any dedicated DXR-accelerating hardware. Point being, there is 0 chance AMD's next GPUs won't have raytracing hardware. Considering they've acknowledged waiting on 7nm, plus their much lower R&D money compared to NV, it all makes sense that their next consumer GPUs won't be ready until next summer.

Also, remember that Navi is being designed also with XB/PS5 in mind, which strongly suggests the next consoles will have at least 1st gen raytracing capacity. That means, by 2020, we'll have 2nd gen raytracing GPUs, and 1st gen raytracing consoles. This will ensure DXR is not a niche feature, but it'll become pretty common in a couple years.
 
T&L had its share of controversy and skeptics, just as RT does now... The GeForce 256 was a few years too late to make a dramatic impact. RT is a few years too soon.

Doing a simple word switch with Anandtech's 1999 GF 256 review is uncanny:

"NVIDIA hopes to change that by delivering a graphics chip that can handle more polygons than even the fastest CPUs could handle and, thus, they hope to promote the increase in polygon counts in games. The GeForce 256 itself can drive a peak polygon throughput of 15 Million polygons per second, an amazing feat for a graphics card. There is no arguing that the number of polygons used in games will definitely increase over time as graphics accelerators get more powerful, but it is naïve to say that the GeForce alone will promote this."

Update it 20 years later:

NVIDIA hopes to change that by delivering a graphics chip that can handle more raytracing than even the fastest CPUs could handle and, thus, they hope to promote the increase in raytracing calculations in games. The GeForce RTX 2080 Ti itself can drive a peak ray throughput of 10 gigarays per second, an amazing feat for a graphics card. There is no arguing that the number of raytracing effects used in games will definitely increase over time as graphics accelerators get more powerful, but it is naïve to say that the GeForce alone will promote this.

And indeed, DXR will go nowhere fast without the support of AMD and Intel. Which is pretty much a given, making DXR future proof.
 
I just said AMD collaborated with MSFT just like NV to create DXR, they already knew it was coming because they helped create it...

Yes, you said that. Do you have any evidence of serious AMD participation? No doubt AMD would have been informed and given some chance for feedback, but this looks much more like a NV and MS partnership.

NVidia and Microsoft have been working in lockstep on Ray Tracing. NVidia has stated several times they partnered with Microsoft on Raytracing.

The first reveal of Microsoft DXR was in March GDC 2018, which was a huge NVidia Raytracing showcase. Just google GDC and Raytracing. It's all NVidia and Microsoft:
https://www.geeks3d.com/20180322/gdc-2018-directx-raytracing-dxr-videos-links/

In short, the DXR announcement was primarily a NVidia showcase, so it really shows limited AMD collaboration, or they would have had something...

All the Ray tracing demos were NVidia Demos. NVidia announced immediate day 1 support with Volta HW for DXR, AMD was mute.

Certainly there is a DX Raytracing API and AMD will support it someday, and they will give feedback to MS, but NVidia really seems to be the driver of this.
 
NVidia really seems to be the driver of this.

Nvidia has been in the driver's seat since DX10, largely due to ATi and then AMD's complacency. And yet we have DX12 AMD GPUs.

AMD almost certainly has a product nearly ready. If they don't, we should expect their announcement of the sale of RTG to Intel presently.
 
Indeed, but to be fair, it's the only way to access it if GPUs don't have any dedicated DXR-accelerating hardware. Point being, there is 0 chance AMD's next GPUs won't have raytracing hardware. Considering they've acknowledged waiting on 7nm, plus their much lower R&D money compared to NV, it all makes sense that their next consumer GPUs won't be ready until next summer.

Also, remember that Navi is being designed also with XB/PS5 in mind, which strongly suggests the next consoles will have at least 1st gen raytracing capacity. That means, by 2020, we'll have 2nd gen raytracing GPUs, and 1st gen raytracing consoles. This will ensure DXR is not a niche feature, but it'll become pretty common in a couple years.

I have a hard time believing next gen consoles would have the kind of horsepower needed to handle raytracing in any capacity. Too soon to be able to do that at low cost. We are more likely to see the GPU used to provide better framerates at checkerboard 4K.

AMD's desktop hardware will have raytracing support but I think we need to go all the way to PS6 for consoles to get on that bandwagon.
 
Yes, you said that. Do you have any evidence of serious AMD participation? No doubt AMD would have been informed and given some chance for feedback, but this looks much more like a NV and MS partnership.

Ah, it is exhausting when people don't read posts properly... a) it's not my job to find proof for you, you can google things like anyone else. But since I'm nice, LMGTFY: among many others, Anandtech said:

"Meanwhile AMD has also announced that they’re collaborating with Microsoft and that they’ll be releasing a driver in the near future that supports DXR acceleration. The tone of AMD’s announcement makes me think that they will have very limited hardware acceleration relative to NVIDIA, but we’ll have to wait and see just what AMD unveils once their drivers are available."

And b) I did not comment on the degree of AMD's participation in the DXR endeavor. I responded to the following comment:
[Attached screenshot: upload_2018-9-7_11-31-21.png]

The comment said it would be AMD's fault if they didn't know about DXR. To which I replied that I had already mentioned that they did know well in advance, as they were involved in its development. I didn't comment at all on how much AMD or NV contributed to DXR, just emphasized that they knew and obviously weren't blindsided in some way - because otherwise we're implying that only NV is on the ball about new developments, which is not true. Are they the main driver? Yes. Are they the only driver? No.

I have a hard time believing next gen consoles would have the kind of horsepower needed to handle raytracing in any capacity. Too soon to be able to do that at low cost. We are more likely to see the GPU used to provide better framerates at checkerboard 4K. AMD's desktop hardware will have raytracing support but I think we need to go all the way to PS6 for consoles to get on that bandwagon.

I strongly disagree. You think new consoles in 2020 won't have RTX 2070 equivalent performance? Fat chance. The XB1X and PS4 Pro already do 4K checkerboard; that wouldn't sell new consoles, it's not progress. Consoles these days use regular parts already in production, tweaked for their custom designs. Then that design is solidified for a number of years where all devs develop to the same static HW configuration. Since we established AMD is already involved with DXR, and we know all their new stuff is coming at 7nm, and it's been reported that their next GPU is related to XB/PS5, put those three together and you already know there'll be a degree of raytracing in consoles in 2020. That doesn't mean it'll all be raytraced, but certainly you'll get 2070-class performance on XB/PS5. Add to that a 5-year lifetime that always shows great improvement due to the locked-down spec, and you pretty much have guaranteed decent use of raytraced effects in the next console generation.

You just have to read the news and strong rumors, and connect the dots. Nothing is 100% definite, but the likelihood is very high.
 