[RUMOUR] Nvidia to introduce add-in cards for Ray Tracing

8.8% sucks if you're trying to push new tech to software companies; no company will cater to that tiny market without cash under the table. Navi is just as highly priced, so I don't expect it to sell well either.

All the first generation has to do is get things moving, and with basically all new GPU HW going forward having HW RT, it's mission accomplished.

The argument about RT succeeding is settled.
 
Plus, it's ray tracing. The literal holy grail of 3D rendering. No one is gonna 'not' do it once they've discovered how to do it in a way that's at all practical.
 
Yeah, it’s kinda mind-boggling. The people pushing back against RT would probably have been mad about games going 3D or TVs getting color.
 
Yeah, remember AMD's tessellation slider... to limit games from running the way the developer intended... due to AMD being far behind NVIDIA in tessellation performance.

/cue "Too much raytracing..."

I actually think it’ll be different this time and AMD will have a solid RT solution. Of course that’s not going to stop the arguments about how much RT is “enough” depending on whether your favorite brand is in the lead. And just like with everything else some game engines will favor some hardware & drivers over others.

The difference with tessellation is that nvidia had performance to burn because they moved early toward distributed geometry processing.

With raytracing nobody has performance to burn. Every single ray has to count. So there won’t be any nonsense about tossing out a bunch of unnecessary rays.
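To put rough numbers on “every ray has to count” (back-of-the-envelope arithmetic; the resolutions and framerates below are just common targets, not anything from a spec sheet):

```python
# Rays needed per second just for primary rays, before any bounces,
# shadow rays, or extra samples per pixel (each of those multiplies this).
def rays_per_second(width, height, fps, rays_per_pixel=1):
    return width * height * fps * rays_per_pixel

targets = {
    "1080p @ 60 fps":  (1920, 1080, 60),
    "1440p @ 120 fps": (2560, 1440, 120),
    "4K @ 60 fps":     (3840, 2160, 60),
}
for label, (w, h, fps) in targets.items():
    print(f"{label}: {rays_per_second(w, h, fps) / 1e9:.2f} Gigarays/s")
```

For scale, Turing-class cards were marketed at around 10 Gigarays/s, and every bounce, shadow ray, or extra sample per pixel multiplies the numbers above - hence all the denoising and upscaling.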

I’m excited for all the cool new things developers will discover with ray tracing + denoising. Add to that the massive step forward in rasterization with mesh shaders and we’re in for a treat this generation.
 
Yeah, it’s kinda mind-boggling. The people pushing back against RT would probably have been mad about games going 3D or TVs getting color.
It's the perceived idea that RT is competing with other tech/resources for a very small graphical update. I'm fairly uninformed, so as an uninformed person, let me give you my concern as an example.

Two "revolutionary" updates are upon us:
RT
VR

RT looks nice. It's too bad there is a large fps drop. It's nice that DLSS seems to help with that. It's not really a "wow" moment like previous tech improvements such as 3D hardware acceleration or games that started using programmable shaders.
VR is a "wow" moment. If there is ANY reason that RT is causing VR to not be the best it can be (engineers could be optimizing VR instead of working on RT; GPU resources could be dedicated to achieving ultra-high resolutions at 120fps), then I'm saddened.

VR is my example. Others may just want more rasterization performance because they think 4K 240fps looks better than RT. Or maybe people are just concerned that the cost increase for RT is being passed on to them when they don't really even care about RT.

But, again, as an uninformed person on the details, I don't know. I'll just play the games I get on the hardware I have :)
 
VR is certainly a bigger "wow", but the cost and aggravation are also MUCH higher. For me VR is a novelty to try out, but most likely I will never have it at home, the same way I think roller coasters are a big wow but I will never build one in my backyard.

But it does occur to me that nowhere will RT have a bigger impact than in VR. The difference between properly perspective-correct effects and faked ones will likely be night and day when moving around in a VR world. Think about the real world when looking at a reflection: a small shift in your head position will shift the reflection in just a certain way that faked reflections will likely never get quite right, but RT will nail.
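To make "perspective correct" concrete: a traced reflection depends on where the eye actually is, so each eye in a headset (and every head position) gets its own slightly different, correct answer. A minimal sketch of the standard mirror-reflection math (the positions here are made up purely for illustration):

```python
import numpy as np

def reflection_dir(eye, point, normal):
    # View direction from the eye to the point on the mirror,
    # reflected about the surface normal: r = v - 2*(v.n)*n
    v = point - eye
    v = v / np.linalg.norm(v)
    n = normal / np.linalg.norm(normal)
    return v - 2.0 * np.dot(v, n) * n

point = np.array([0.0, 0.0, 0.0])   # spot on a mirror-like floor
normal = np.array([0.0, 1.0, 0.0])  # floor normal, pointing up

# Left and right eye roughly 6 cm apart: each gets a different reflection
# direction, which a baked cubemap or screen-space hack can't reproduce.
for eye in [np.array([-0.03, 1.7, 1.0]), np.array([0.03, 1.7, 1.0])]:
    print(eye, "->", np.round(reflection_dir(eye, point, normal), 3))
```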

So VR fans should be very happy that RT is becoming table stakes, though the HW required to do RT and high refresh VR together will be scary.
 
I question how important that is. Playing Half-Life: Alyx, I didn't seem to mind that wet-surface reflections were not accurate (I'm not sure I even noticed). I wish DXR and RT demos focused a lot more on global illumination than reflections. I think that effect is a lot more interesting.
 
It's the perceived idea that RT is competing with other tech/resources for a very small graphical update.

That’s very true. However, RT isn’t competing with VR, since the most pressing problems there are around streamlining the user experience and interface. RT is competing with rasterization hacks and with higher resolutions and framerates.

I don’t think anyone can argue that RT isn’t miles ahead of those raster hacks in terms of IQ “potential”. People seem to be saying that RT is a waste because the first incarnation didn’t change the world, which of course is a silly position to take. The first 3D game didn’t look like Crysis either.

I’d much rather take those early steps now toward RT nirvana than keep inching along toward the end of the line for rasterization.

 
Yeah, it’s kinda mind-boggling. The people pushing back against RT would probably have been mad about games going 3D or TVs getting color.
No one is "pushing back against" the technology itself. Just its forced inclusion on hardware that isn't very impressive at an exorbitant cost.

As I said in a previous post: RT is cool and I will buy it someday. But the benefit, to me, does not justify the price - and I seriously doubt I am alone in this regard.
 
That’s very true. However, RT isn’t competing with VR, since the most pressing problems there are around streamlining the user experience and interface. RT is competing with rasterization hacks and with higher resolutions and framerates.

I don’t think anyone can argue that RT isn’t miles ahead of those raster hacks in terms of IQ “potential”. People seem to be saying that RT is a waste because the first incarnation didn’t change the world, which of course is a silly position to take. The first 3D game didn’t look like Crysis either.

I’d much rather take those early steps now toward RT nirvana than keep inching along toward the end of the line for rasterization.

The question is: will we be able to do RT fast enough to do it without a bunch of hacks? Right now it is too slow unless you have a very simple game and want to run it at a very low res and use something like DLSS to upscale it (which is a hack of sorts), OR you want to use it for only a couple of effects. The real benefit of ray tracing would come from a full, no-BS implementation with global illumination, caustics, and all that, so that the art department no longer had to worry about it: you could just design your materials, design your models, design your world, and everything would look "right".
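On the DLSS point, the reason upscaling helps so much is plain arithmetic: primary rays scale with the internal render resolution, not the output resolution. A quick sketch (the per-axis scales are roughly what "quality"/"performance"-style upscaling modes use; exact ratios vary by implementation):

```python
# Fraction of native-resolution primary rays needed when tracing at a
# reduced internal resolution and reconstructing up to the output size.
def ray_fraction(scale_per_axis):
    return scale_per_axis ** 2  # pixel count (and ray count) is quadratic

for scale in (1.0, 0.67, 0.5):
    print(f"{scale:.0%} per axis -> {ray_fraction(scale):.0%} of native rays")
```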

Right now it is of questionable usefulness, as it more or less remains a hack, just one of a different sort. You use it to make something look nifty, but it requires implementation for that specific thing.

We'll see in the long run. I remain a little skeptical that hardware will get fast enough, because people have been on about real-time raytracing for so long and it has never happened; but then I do know how much hardware has improved in the past, and I could see it happening.
 
Minecraft RTX is basically this; almost nothing is even rasterized at all.

The camera obscura is bonkers.
 
No one is "pushing back against" the technology itself. Just its forced inclusion on hardware that isn't very impressive at an exorbitant cost.

As I said in a previous post: RT is cool and I will buy it someday. But the benefit, to me, does not justify the price - and I seriously doubt I am alone in this regard.

Forced inclusion? What hardware feature have we as consumers had an opinion on in the past? Blaming RT for the price of Turing cards is speculative at best.
 
The question is: will we be able to do RT fast enough to do it without a bunch of hacks? Right now it is too slow unless you have a very simple game and want to run it at a very low res and use something like DLSS to upscale it (which is a hack of sorts), OR you want to use it for only a couple of effects. The real benefit of ray tracing would come from a full, no-BS implementation with global illumination, caustics, and all that, so that the art department no longer had to worry about it: you could just design your materials, design your models, design your world, and everything would look "right".

Right now it is of questionable usefulness, as it more or less remains a hack, just one of a different sort. You use it to make something look nifty, but it requires implementation for that specific thing.

We'll see in the long run. I remain a little skeptical that hardware will get fast enough, because people have been on about real-time raytracing for so long and it has never happened; but then I do know how much hardware has improved in the past, and I could see it happening.

This echoes exactly what I said earlier. A lot of criticism of RT is that it’s not perfect on the first try. That is an extremely unreasonable and impractical stance. It took 20 years for rasterization to make it this far. I think we can afford to give raytracing a few years to get going.
 
Forced inclusion? What hardware feature have we as consumers had an opinion on in the past? Blaming RT for the price of Turing cards is speculative at best.
Are you serious? I'll say it again:

The RTX series cards were both the first of their kind when it came to dedicated raytracing silicon and nearly twice as expensive as the previous generation. Asserting that those two facts are totally coincidental seems beyond speculation to me - it's flat-out fanboyism. And worse, it's unnecessary fanboyism. I understand that Nvidia had to pay for their R&D. I get that. I'm not saying it's some immoral thing that they did. I'm also not criticizing RT for not being perfect on its first try, as you note in your next post. The first iteration of a technology is bound to be just that - a first iteration.

Can I assume you've already bought an RTX card? Maybe it's harder to rationalize once you've dropped that cash, but all I'm here to say is this: it is not a good value proposition. The increased performance in traditionally rendered games is NOT on par with the increased price, and the library of content that leverages RTX is still underwhelming.

We're not all in it for the value. Some of us like supporting the shiny and new. Some of us like having the newest whether it's shiny and new, or whether it's 14nm+++++++++++etc. Nothing wrong with either thing, it's your money, spend it how you want. I'm not saying no one should buy these things, just that choosing to not buy an RTX card and wait another generation makes plenty of sense. I will bet a large number of current 10-series owners in the enthusiast crowd have done just that.
 
This echoes exactly what I said earlier. A lot of criticism of RT is that it’s not perfect on the first try. That is an extremely unreasonable and impractical stance. It took 20 years for rasterization to make it this far. I think we can afford to give raytracing a few years to get going.

I'll give it a chance for sure; I'm just not all-in on it. I'm not ready to declare it to be the future as some (like nVidia) are. I remain skeptical. That doesn't mean I'll hate on it. I don't really care what underlying technology games use, so long as they look good. Ultimately RT would make that easiest on the artists, if it can be made performant. That's a big if, one we'll see as time goes on.

Minecraft RTX is basically this; almost nothing is even rasterized at all.

The camera obscura is bonkers.

Well, it's not quite as fully real-time raytraced as they make it look. You discover some things are computed non-realtime, such as shadows, which you can see when opening chests. But all that said, that is my point: you take one of the most graphically simple games out there - very low poly, simple materials, etc. - and you CAN ray trace it... fairly slowly. It still really needs DLSS to be performant. A 2080 Super can't hold 60fps at 1920x1080 (it gets close on average but drops below plenty).

Thus, while it is really cool, it is actually a good example of why full raytracing is NOT ready for primetime and is quite a ways off. You take simple games like Minecraft or Quake 2 (which is 23-year-old graphics) and, yeah, you can do (basically) full raytracing with OK performance so long as you have a couple of tricks. Great... but it's not replacing rasterization any time soon. Now hopefully it advances quickly, and maybe it will. But of course you could also say, "Well, rasterization was doing Quake 2 23 years ago, so what happens if it takes raytracing 23 years to catch up to today's games?"

Just saying, you have to understand where some of the skepticism comes from.
 
I'll give it a chance for sure; I'm just not all-in on it. I'm not ready to declare it to be the future as some (like nVidia) are. I remain skeptical. That doesn't mean I'll hate on it. I don't really care what underlying technology games use, so long as they look good. Ultimately RT would make that easiest on the artists, if it can be made performant. That's a big if, one we'll see as time goes on.



Well, it's not quite as fully real-time raytraced as they make it look. You discover some things are computed non-realtime, such as shadows, which you can see when opening chests. But all that said, that is my point: you take one of the most graphically simple games out there - very low poly, simple materials, etc. - and you CAN ray trace it... fairly slowly. It still really needs DLSS to be performant. A 2080 Super can't hold 60fps at 1920x1080 (it gets close on average but drops below plenty).

Thus, while it is really cool, it is actually a good example of why full raytracing is NOT ready for primetime and is quite a ways off. You take simple games like Minecraft or Quake 2 (which is 23-year-old graphics) and, yeah, you can do (basically) full raytracing with OK performance so long as you have a couple of tricks. Great... but it's not replacing rasterization any time soon. Now hopefully it advances quickly, and maybe it will. But of course you could also say, "Well, rasterization was doing Quake 2 23 years ago, so what happens if it takes raytracing 23 years to catch up to today's games?"

Just saying, you have to understand where some of the skepticism comes from.
I think this is why making an ADD-ON ray tracing card would be interesting - basically a card designed primarily for fully ray tracing a game. I wonder what the performance would be like.
 
Are you serious? I'll say it again:

The RTX series cards were both the first of their kind when it came to dedicated raytracing silicon and nearly twice as expensive as the previous generation. Asserting that those two facts are totally coincidental seems beyond speculation to me - it's flat-out fanboyism.

Your hypothesis is flawed and unfounded. New hardware features are added every generation without the regression in fps/$ we saw with Turing. What makes you think RT is special in terms of its impact on pricing? You’ve provided no evidence except to say “well A and B happened so obviously A was the reason for B”. Correlation is not causation etc etc.

I already laid out the obvious and practical reasons for Turing pricing earlier in the thread.
 
I'll give it a chance for sure; I'm just not all-in on it. I'm not ready to declare it to be the future as some (like nVidia) are. I remain skeptical. That doesn't mean I'll hate on it. I don't really care what underlying technology games use, so long as they look good. Ultimately RT would make that easiest on the artists, if it can be made performant. That's a big if, one we'll see as time goes on.

Nvidia is trying to sell graphics cards but the graphics industry embraced RT ages ago. The graphical fidelity that RT delivers isn’t a subjective thing. The science and research behind it is well established. Not to mention the results you can see with your own eyes.

What’s new is that real-time gaming hardware is now catching up. Performance is key for sure, but RT is an extremely parallel algorithm, so throwing hardware at the problem should be easy enough.
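On "extremely parallel": every pixel's rays can be traced independently, with no coordination between pixels, which is exactly the kind of work you can keep throwing wider hardware at. A toy sketch (the per-pixel "trace" is a placeholder; the independence is the point):

```python
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 320, 180

def trace_pixel(i):
    # Placeholder for real ray traversal + shading. The key property:
    # no pixel reads or writes another pixel's state, so any number of
    # workers can split the image without locks or communication.
    x, y = i % WIDTH, i // WIDTH
    return (x ^ y) & 0xFF

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        image = list(pool.map(trace_pixel, range(WIDTH * HEIGHT), chunksize=WIDTH))
    print(f"traced {len(image)} pixels, all independently")
```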
 
That is the opposite of what is discussed here. That card has everything except Ray Tracing HW.
Well, the rumor was just a smidge off about what the add-in card nVidia was announcing actually was. Unless they have it still kicking around on a back shelf awaiting its official debut.
 
The rumor was made-up nonsense, based on guesses from reading a patent and on looking at the strange cooler leak without understanding how it worked.
 