[RUMOUR] Nvidia to introduce add-in cards for Ray Tracing

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
10,662
RTX cards are 8.8%, and yet the 1060 by itself is 11.80%. You're right, the market did speak: they decided not to upgrade at the current cost of new cards. Also, Pascal is 34.79% per the Steam hardware survey, which shows just how loudly people think it's not worth upgrading, so yes, their choice was clear.
And Quad Cores are almost 50% of the Survey, so Ryzen must be some kind of irrelevant failure, right?

You have to recognize that not everyone runs out and upgrades annually, or even every two years, and when people do upgrade, what do they do with their old cards? Burn them? As long as cards are usable, they will stay in circulation. Pascal was one of NVidia's biggest success stories ever; those cards will be in circulation for a long time. I don't think anyone is arguing that Turing was a bigger success than Pascal.

But 8.8% is not bad for RTX cards given how expensive the RTX cards are, while Navi cards have yet to reach 1%.
 

Sycraft

Supreme [H]ardness
Joined
Nov 9, 2006
Messages
4,671
Digital Foundry has some analysis of PS5 ray tracing in the demos. They appear to be relying on many of the same optimizations that current RTX games use, so it doesn't look like there is any breakthrough in RT HW performance, nor anything that makes RT less resource-intensive:
I mean, ray tracing is old as fuck; there really aren't going to be any magic optimizations to make it suddenly fast. It is an algorithm from the early days of computers, and we've had a lot of time to work on it. Its overall simplicity (CS students often write a ray tracer as a school project) is one of the things that makes it cool, but it also means that big optimizations aren't really in the cards. It is doing the same simple thing, over and over and over.
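For what it's worth, the "school project" line isn't an exaggeration: the whole algorithm is one intersection test inside a per-pixel loop. A minimal toy sketch; everything here is invented for illustration (one sphere, hit/miss shading, no lighting):

```python
import math

def ray_sphere(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for the nearest positive t, or None on a miss.
    # `direction` is assumed normalized, so the quadratic's `a` term is 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(width=8, height=8):
    # The entire renderer: fire one ray per pixel through a pinhole camera
    # at a single sphere, and shade by hit ('#') or miss ('.').
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            dx = (x + 0.5) / width - 0.5
            dy = (y + 0.5) / height - 0.5
            norm = math.sqrt(dx * dx + dy * dy + 1)
            d = (dx / norm, dy / norm, 1 / norm)
            row.append('#' if ray_sphere((0, 0, 0), d, (0, 0, 3), 1.0) else '.')
        image.append(''.join(row))
    return image
```

Everything past this (shadows, reflections, global illumination) is just more rays fired from the hit points, which is exactly why the cost explodes and the hardware ends up being the bottleneck.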

The speedups we see and will continue to see are going to be brute force: lots of hardware dedicated to doing it fast. That will take time, as there's only so much you can cram onto a chip. It also means that we aren't likely to see AMD or nVidia or Intel magically pull way ahead; performance will be limited by how much silicon they can throw at it.
 

Gideon

2[H]4U
Joined
Apr 13, 2006
Messages
2,669
And Quad Cores are almost 50% of the Survey, so Ryzen must be some kind of irrelevant failure, right?

You have to recognize that not everyone runs out and upgrades annually, or even every two years, and when people do upgrade, what do they do with their old cards? Burn them? As long as cards are usable, they will stay in circulation. Pascal was one of NVidia's biggest success stories ever; those cards will be in circulation for a long time. I don't think anyone is arguing that Turing was a bigger success than Pascal.

But 8.8% is not bad for RTX cards given how expensive the RTX cards are, while Navi cards have yet to reach 1%.
CPU power is bought based on need: if a quad core is all you need, then that's all you buy; if you need more cores, you buy appropriately, since more cores only help if you have the software to use them. Your comparison is nothing like what we're discussing, which is a new GPU feature that came with a massive price increase, and ray tracing performance leaves a lot to be desired as well. Also, plenty of people are still on hardware much older than Pascal, and they still passed on the increased prices of current cards, be it AMD or Nvidia.

8.8% sucks if you're trying to push new tech to software companies; no company will cater to that tiny market without cash under the table. Navi is also just as highly priced, so I don't expect it to sell well either. You also need to factor in that 2.38% of that 8.8% is the 2060, and those cards are not doing much, if any, ray tracing. Underperforming and overpriced has been the motto of this generation of cards; I am hoping it improves with the coming generation, or you had better get used to consoles being the dominant gaming market again.
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
10,662
8.8% sucks if you're trying to push new tech to software companies; no company will cater to that tiny market without cash under the table. Navi is also just as highly priced, so I don't expect it to sell well either.
All the first generation has to do is get things moving, and with basically all new GPU HW going forward having HW RT, it's mission accomplished.

The argument about RT succeeding is settled.
 

staknhalo

[H]ard|Gawd
Joined
Jun 11, 2007
Messages
1,327
All the first generation has to do is get things moving, and with basically all new GPU HW going forward having HW RT, it's mission accomplished.

The argument about RT succeeding is settled.
Plus, it's ray tracing. The literal holy grail of 3D rendering. No one is gonna 'not' do it once it's been discovered how to do it in a way that works with any practicality.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
866
Plus, it's ray tracing. The literal holy grail of 3D rendering. No one is gonna 'not' do it once it's been discovered how to do it in a way that works with any practicality.
Yeah, it's kinda mind-boggling. The people pushing back against RT would probably have been mad about games going 3D or TVs getting color.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
866
Yeah, remember AMD's tessellation slider...to keep games from running as the developer intended...due to AMD being far behind NVIDIA in tessellation performance.

/cue "Too much raytracing..."
I actually think it’ll be different this time and AMD will have a solid RT solution. Of course that’s not going to stop the arguments about how much RT is “enough” depending on whether your favorite brand is in the lead. And just like with everything else some game engines will favor some hardware & drivers over others.

The difference with tessellation is that nvidia had performance to burn because they moved early toward distributed geometry processing.

With raytracing nobody has performance to burn. Every single ray has to count. So there won’t be any nonsense about tossing out a bunch of unnecessary rays.

I’m excited for all the cool new things developers will discover with ray tracing + denoising. Add to that the massive step forward in rasterization with mesh shaders and we’re in for a treat this generation.
 

serpretetsky

[H]ard|Gawd
Joined
Dec 24, 2008
Messages
1,757
Yeah, it's kinda mind-boggling. The people pushing back against RT would probably have been mad about games going 3D or TVs getting color.
It's the perceived idea that RT is competing with other tech/resources for a very small graphical update. I'm fairly uninformed, so as an uninformed person, let me give you my concern as an example.

Two "revolutionary" updates are upon us:
RT
VR

RT looks nice. It's too bad there is a large fps drop. It's nice that DLSS seems to help with that. It's not really a "wow" moment like previous tech improvements such as 3D hardware acceleration or games that started using programmable shaders.
VR is a "wow" moment. if there is ANY reason that RT is causing VR to not be the best that it should be ( engineers could be optimizing VR instead of working on RT, gpu resources could be dedicated more to achieving ultra high resolutions at 120fps) then I'm saddened.

VR is my example. For others they may just want more rasterizing performance because they think 4k 240fps looks better than RT. Or maybe people are just concerned that the cost increase for RT is being passed onto them, when they don't really even care about RT.

But, again, as an uninformed person on the details, I don't know. I'll just play the games I get on the hardware I have :)
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
10,662
It's the perceived idea that RT is competing with other tech/resources for a very small graphical update. I'm fairly uninformed, so as an uninformed person, let me give you my concern as an example.

Two "revolutionary" updates are upon us:
RT
VR

RT looks nice. It's too bad there is a large fps drop. It's nice that DLSS seems to help with that. It's not really a "wow" moment like previous tech improvements such as 3D hardware acceleration or games that started using programmable shaders.
VR is a "wow" moment. if there is ANY reason that RT is causing VR to not be the best that it should be ( engineers could be optimizing VR instead of working on RT, gpu resources could be dedicated more to achieving ultra high resolutions at 120fps) then I'm saddened.

VR is my example. For others they may just want more rasterizing performance because they think 4k 240fps looks better than RT. Or maybe people are just concerned that the cost increase for RT is being passed onto them, when they don't really even care about RT.

But, again, as an uninformed person on the details, I don't know. I'll just play the games I get on the hardware I have :)
VR is certainly a bigger wow, but the cost and aggravation are also MUCH higher. For me VR is a novelty to try out, but most likely I will never have it at home, the same way I think roller coasters are a big wow but I will never build one in my back yard.

But it does occur to me that nowhere will RT have a bigger impact than in VR. The difference between properly perspective-correct effects and faked ones will likely be night and day when moving around in a VR world. Think about looking at a reflection in the real world, where a small shift of your head moves the reflection in just a certain way that faked reflections will likely never get quite right, but RT will nail.

So VR fans should be very happy that RT is becoming table stakes, though the HW required to do RT and high refresh VR together will be scary.
 

serpretetsky

[H]ard|Gawd
Joined
Dec 24, 2008
Messages
1,757
But it does occur to me that nowhere will RT have a bigger impact than in VR. The difference between properly perspective-correct effects and faked ones will likely be night and day when moving around in a VR world. Think about looking at a reflection in the real world, where a small shift of your head moves the reflection in just a certain way that faked reflections will likely never get quite right, but RT will nail.

So VR fans should be very happy that RT is becoming table stakes, though the HW required to do RT and high refresh VR together will be scary.
I question how important that is. Playing Half-Life: Alyx, I didn't seem to mind that wet-surface reflections were not accurate (I'm not sure I even noticed). I wish DXR and RT demos focused a lot more on global illumination than reflections. I think that effect is a lot more interesting.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
866
It's the perceived idea that RT is competing with other tech/resources for a very small graphical update.
That’s very true. However RT isn’t competing with VR as the most pressing problems there are around streamlining the user experience and interface. RT is competing with rasterization hacks and higher resolutions and framerates.

I don’t think anyone can argue that RT isn’t miles ahead of those raster hacks in terms of IQ “potential”. People seem to be saying that RT is a waste because the first incarnation didn’t change the world. Which of course is a silly position to take. The first 3D game didn’t look like Crysis either.

I'd much rather take those early steps now toward RT nirvana than keep inching along toward the end of the line for rasterization.

 

VanGoghComplex

[H]ard|Gawd
Joined
Apr 5, 2016
Messages
1,995
Yeah it‘s kinda mind boggling. The people pushing back against RT would probably have been mad about games going 3D or TVs getting color.
No one is "pushing back against" the technology itself. Just its forced inclusion on hardware that isn't very impressive at an exorbitant cost.

As I said in a previous post: RT is cool and I will buy it someday. But the benefit, to me, does not justify the price - and I seriously doubt I am alone in this regard.
 

Sycraft

Supreme [H]ardness
Joined
Nov 9, 2006
Messages
4,671
That’s very true. However RT isn’t competing with VR as the most pressing problems there are around streamlining the user experience and interface. RT is competing with rasterization hacks and higher resolutions and framerates.

I don’t think anyone can argue that RT isn’t miles ahead of those raster hacks in terms of IQ “potential”. People seem to be saying that RT is a waste because the first incarnation didn’t change the world. Which of course is a silly position to take. The first 3D game didn’t look like Crysis either.

I'd much rather take those early steps now toward RT nirvana than keep inching along toward the end of the line for rasterization.
The question is: will we be able to do RT fast enough to do it without a bunch of hacks? Right now, it is too slow unless you have a very simple game and want to run it at a very low res and use something like DLSS to upscale it (which is a hack of sorts), OR you want to use it for only a couple of effects. The real benefit of ray tracing would come with a full, no-BS implementation with global illumination, caustics, and all that, so that you no longer had to have the art department worry about it; you could just design your materials, design your models, design your world, and everything would look "right".

Right now it is of questionable usefulness, as it more or less remains a hack, just one of a different sort. You use it to make something look nifty, but it requires implementation for that specific thing.

We'll see in the long run. I remain a little skeptical that hardware will get fast enough, because people have been on about realtime raytracing for so long and it has never happened, but then I do know how much hardware has improved in the past, and I could see it happening.
 

socK

2[H]4U
Joined
Jan 25, 2004
Messages
3,899
The question is: will we be able to do RT fast enough to do it without a bunch of hacks? Right now, it is too slow unless you have a very simple game and want to run it at a very low res and use something like DLSS to upscale it (which is a hack of sorts), OR you want to use it for only a couple of effects. The real benefit of ray tracing would come with a full, no-BS implementation with global illumination, caustics, and all that, so that you no longer had to have the art department worry about it; you could just design your materials, design your models, design your world, and everything would look "right".

Right now it is of questionable usefulness, as it more or less remains a hack, just one of a different sort. You use it to make something look nifty, but it requires implementation for that specific thing.

We'll see in the long run. I remain a little skeptical that hardware will get fast enough, because people have been on about realtime raytracing for so long and it has never happened, but then I do know how much hardware has improved in the past, and I could see it happening.
Minecraft RTX is basically this; almost nothing is rasterized at all.

The camera obscura is bonkers
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
866
No one is "pushing back against" the technology itself. Just its forced inclusion on hardware that isn't very impressive at an exorbitant cost.

As I said in a previous post: RT is cool and I will buy it someday. But the benefit, to me, does not justify the price - and I seriously doubt I am alone in this regard.
Forced inclusion? What hardware feature have we as consumers had an opinion on in the past? Blaming RT for the price of Turing cards is speculative at best.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
866
The question is: will we be able to do RT fast enough to do it without a bunch of hacks? Right now, it is too slow unless you have a very simple game and want to run it at a very low res and use something like DLSS to upscale it (which is a hack of sorts), OR you want to use it for only a couple of effects. The real benefit of ray tracing would come with a full, no-BS implementation with global illumination, caustics, and all that, so that you no longer had to have the art department worry about it; you could just design your materials, design your models, design your world, and everything would look "right".

Right now it is of questionable usefulness, as it more or less remains a hack, just one of a different sort. You use it to make something look nifty, but it requires implementation for that specific thing.

We'll see in the long run. I remain a little skeptical that hardware will get fast enough, because people have been on about realtime raytracing for so long and it has never happened, but then I do know how much hardware has improved in the past, and I could see it happening.
This echoes exactly what I said earlier. A lot of criticism of RT is that it’s not perfect on the first try. That is an extremely unreasonable and impractical stance. It took 20 years for rasterization to make it this far. I think we can afford to give raytracing a few years to get going.
 

VanGoghComplex

[H]ard|Gawd
Joined
Apr 5, 2016
Messages
1,995
Forced inclusion? What hardware feature have we as consumers had an opinion on in the past? Blaming RT for the price of Turing cards is speculative at best.
Are you serious? I'll say it again:

The RTX series cards were both first in class when it came to dedicated raytracing silicon, and they were also nearly twice as expensive as the previous generation. Asserting that those two facts are totally coincidental seems beyond speculation to me - it's flat out fanboyism. And worse, it's unnecessary fanboyism. I understand that Nvidia had to pay for their R&D. I get that. I'm not saying it's some amoral thing that they did. I'm also not criticizing RT for not being perfect on its first try, as you note in your next post. The first iteration of a technology is bound to be just that - a first iteration.

Can I assume you've already bought an RTX card? Maybe it's harder to rationalize once you've dropped that cash, but all I'm here to say is this: it is not a good value proposition. The increased performance in traditionally rendered games is NOT on par with the increased price, and the library of content that leverages RTX is still underwhelming.

We're not all in it for the value. Some of us like supporting the shiny and new. Some of us like having the newest whether it's shiny and new, or whether it's 14nm+++++++++++etc. Nothing wrong with either thing, it's your money, spend it how you want. I'm not saying no one should buy these things, just that choosing to not buy an RTX card and wait another generation makes plenty of sense. I will bet a large number of current 10-series owners in the enthusiast crowd have done just that.
 

Sycraft

Supreme [H]ardness
Joined
Nov 9, 2006
Messages
4,671
This echoes exactly what I said earlier. A lot of criticism of RT is that it’s not perfect on the first try. That is an extremely unreasonable and impractical stance. It took 20 years for rasterization to make it this far. I think we can afford to give raytracing a few years to get going.
I'll give it a chance for sure, I'm just not all-in on it. I'm not ready to declare it the future as some (like nVidia) are. I remain skeptical. That doesn't mean I'll hate on it. I don't really care what underlying technology games use, so long as they look good. Ultimately RT would make things easiest on the artists, if it can be made performant. That's a big if, one we'll see about as time goes on.

Minecraft RTX is basically this; almost nothing is rasterized at all.

The camera obscura is bonkers
Well, it's not quite as fully real-time raytraced as they make it look. You discover some things are computed non-realtime, such as shadows, which you can see when opening chests. But all that said, that is my point: you take one of the most graphically simple games out there, very low poly, simple materials, etc., and you CAN ray trace it... fairly slowly. It still really needs DLSS to be performant. A 2080 Super can't hold 60fps at 1920x1080 (it gets close on average but drops below plenty).

Thus, while it is really cool, it is actually a good example of why full raytracing is NOT ready for primetime and is quite a ways off. You take simple games like Minecraft or Quake 2 (which is 23-year-old graphics) and yeah, you can do (basically) full raytracing with OK performance, so long as you have a couple of tricks. Great... but it's not replacing rasterization any time soon. Hopefully it advances quickly, and maybe it will. But of course you could also say, "Well, rasterization was doing Quake 2 23 years ago, so what happens if it takes raytracing 23 years to catch up to today's games?"

Just saying you have to understand where some of the skepticism comes from.
 

serpretetsky

[H]ard|Gawd
Joined
Dec 24, 2008
Messages
1,757
I'll give it a chance for sure, I'm just not all-in on it. I'm not ready to declare it the future as some (like nVidia) are. I remain skeptical. That doesn't mean I'll hate on it. I don't really care what underlying technology games use, so long as they look good. Ultimately RT would make things easiest on the artists, if it can be made performant. That's a big if, one we'll see about as time goes on.



Well, it's not quite as fully real-time raytraced as they make it look. You discover some things are computed non-realtime, such as shadows, which you can see when opening chests. But all that said, that is my point: you take one of the most graphically simple games out there, very low poly, simple materials, etc., and you CAN ray trace it... fairly slowly. It still really needs DLSS to be performant. A 2080 Super can't hold 60fps at 1920x1080 (it gets close on average but drops below plenty).

Thus, while it is really cool, it is actually a good example of why full raytracing is NOT ready for primetime and is quite a ways off. You take simple games like Minecraft or Quake 2 (which is 23-year-old graphics) and yeah, you can do (basically) full raytracing with OK performance, so long as you have a couple of tricks. Great... but it's not replacing rasterization any time soon. Hopefully it advances quickly, and maybe it will. But of course you could also say, "Well, rasterization was doing Quake 2 23 years ago, so what happens if it takes raytracing 23 years to catch up to today's games?"

Just saying you have to understand where some of the skepticism comes from.
I think this is why making an ADD-ON ray tracing card would be interesting. Or basically a card designed primarily for fully ray tracing a game. I wonder what the performance would be like.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
866
Are you serious? I'll say it again:

The RTX series cards were both first in class when it came to dedicated raytracing silicon, and they were also nearly twice as expensive as the previous generation. Asserting that those two facts are totally coincidental seems beyond speculation to me - it's flat out fanboyism.
Your hypothesis is flawed and unfounded. New hardware features are added every generation without the regression in fps/$ we saw with Turing. What makes you think RT is special in terms of its impact on pricing? You’ve provided no evidence except to say “well A and B happened so obviously A was the reason for B”. Correlation is not causation etc etc.

I already laid out the obvious and practical reasons for Turing pricing earlier in the thread.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
866
I'll give it a chance for sure, I'm just not all-in on it. I'm not ready to declare it the future as some (like nVidia) are. I remain skeptical. That doesn't mean I'll hate on it. I don't really care what underlying technology games use, so long as they look good. Ultimately RT would make things easiest on the artists, if it can be made performant. That's a big if, one we'll see about as time goes on.
Nvidia is trying to sell graphics cards but the graphics industry embraced RT ages ago. The graphical fidelity that RT delivers isn’t a subjective thing. The science and research behind it is well established. Not to mention the results you can see with your own eyes.

What's new is that real-time gaming hardware is now catching up. Performance is key for sure, but RT is an extremely parallel algorithm, so throwing hardware at the problem should be easy enough.
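The "extremely parallel" part is easy to see in code: each pixel's ray depends only on its own coordinates, never on a neighbour's result, so the per-pixel loop splits across workers with no coordination. A toy sketch; `trace_pixel` here is a made-up stand-in, not a real shader:

```python
from concurrent.futures import ThreadPoolExecutor

def trace_pixel(xy):
    # Stand-in for tracing one ray: it reads only this pixel's own
    # coordinates, never a neighbouring pixel's result.
    x, y = xy
    return (x * 31 + y * 17) % 256  # dummy shading value

def render(width, height):
    # Because pixels are independent, the loop maps straight onto a worker
    # pool with no locks or shared state between tasks.
    pixels = [(x, y) for y in range(height) for x in range(width)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(trace_pixel, pixels))
```

On a GPU the same structure maps onto thousands of shader cores, which is why "throw more silicon at it" is the whole scaling story.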
 

ChadD

Supreme [H]ardness
Joined
Feb 8, 2016
Messages
4,634

Lakados

2[H]4U
Joined
Feb 3, 2014
Messages
2,302
That is the opposite of what is discussed here. That card has everything except Ray Tracing HW.
Well, the rumor was just a smidge off about what the add-in card nVidia was announcing actually was. Unless they still have it kicking around on a back shelf, awaiting its official debut.
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
10,662
Well, the rumor was just a smidge off about what the add-in card nVidia was announcing actually was. Unless they still have it kicking around on a back shelf, awaiting its official debut.
The rumor was made-up nonsense, based on guesses from reading a patent and looking at the strange cooler leak without understanding how it worked.
 