[RUMOUR] Nvidia to introduce add-in cards for Ray Tracing

That's funny; I suggested Nvidia do exactly that back when RTX was first introduced.

Either a separate card or a separate chip.

We might end up getting both.

BTW, that would explain why the cards are power hungry in spite of the smaller manufacturing process.
 

Will be interesting to see if this turns out to be true; it would make sense to me if it did.
 
Cool, then I'd be able to keep running my 980 Ti! And create less e-waste, because everyone with a 2060-2080 (or even a 5700 XT) could just upgrade their RT capabilities and probably do just fine until the PS5 Pro comes out. :)
 
Which is exactly why this is a false rumor. Ray tracing is the future and people who don't want it need to be dragged kicking and screaming toward it.
Just like people didn't need shaders, AF, ambient occlusion, etc., etc. :D:D:D

I'm not yet sold on RT, but it's been a really long time coming. I'll keep an eye on it and see what Ampere brings to the table.
 
You mean the RTX 2060? Sure, no prob. :D :D

Nope, I mean it's going to be hilarious that no one will see it coming. I'm thinking of a sinister but funny, ironic turn of events that just absolutely destroys Nvidia. I'm not fanboying; I'm just having fun with a possible outcome that will make people go... what the fuck? No one saw that shit coming!
 
I have little doubt that Big Navi will at the very least match Turing in RT, but Ampere? I just don't see that happening.
The AMD ray tracing demo doesn't put Big Navi anywhere close to what Nvidia has shown (heck, I'd say it looks worse than the old Larrabee demos), but since it's not on final hardware, I'll give it a pass.
 
What could be better than one video card worth more than the combined total of the rest of my system... why, two cards worth more than the combined total of both my and the wife's machines.

Part of me really wants this to happen just so I can see the... "what's your problem, poor, don't you RTX?" posts on social media. lmao. Of course it's never going to happen... but the tears and the bickering would be glorious.
 
That would be fantastic. It would mean the lowest common denominator for RT performance would be fast enough to spur even more adoption from devs. What would suck is if AMD has a half-assed implementation that discourages the use of RT.
 
He's correct. The source of the rumor isn't a leak. It's just groundless speculation. Speculation based on (willful?) ignorance.
Baseless as it is, it sure is an appealing idea. Let the people who want raytracing pay the premium, and let the rest of us have our sub-$1,000 flagship GPUs back.

Raytracing looks great, but hardware acceleration for it is going to have to mature to the point that it's present, capable, and affordable in -50 and -60 series cards. Maybe then it will reach the adoption rate with developers that its advocates pretend we're already at.
 
Wishful thinking is often appealing. But I think it's healthier to view the world as it is...

By the end of the year, RT HW will be integrated in all new GPUs of consequence, from both the new consoles and new Nvidia/AMD discrete GPU parts.

About the only new GPU parts lacking RT HW will be APUs, and perhaps a new entry-level potato discrete GPU barely above that level, though I expect not even that. For the real low end, my expectation is that they will simply rebrand older-generation parts as needed, and all new discrete GPUs will have some level of integrated RT HW.

The minority "won't buy RT HW" crowd is shit outa luck.
 
It's a rumor and I doubt it'll happen, but it would be a good choice for someone like me. I have a 1080 Ti and would like the benefits of RTX without having to get a brand new card.
 
Considering what Kyle said in the other thread, I'd be inclined to think external processing, e.g. on-GPU, could be a thing, but add-in boards with higher latency I'm not so sure about.
AMD has an excellent interconnect on many of their GPUs too (SSG etc.), so they could also go this way. For now, though, I'm more inclined to treat this add-in card as pure rumour.
 
Yeah, remember AMD's tessellation slider... to keep games from running the way the developer intended... due to AMD being far behind NVIDIA in tessellation performance.

/cue "Too much raytracing..."
 
It’s a terrible idea that would only delay the inevitable. I hope you don’t actually believe raytracing is the reason the 2080 Ti is $1200. That’s funny.

A flagship GPU in 2020 must have hardware-accelerated raytracing; otherwise, by definition, it’s not a flagship. Flagship products lead the way, and that is how we make progress.

It was the same in 2001, when a flagship had to have hardware-accelerated vertex transform and lighting even though no games used it. This is what Anandtech said back then:

As we mentioned in our 'NV20' Revealed article, the GeForce3's performance superiority in current games will only lie at high resolutions (higher than 1024 x 768 x 32) or when enabling its Quincunx Anti Aliasing. In many ways, the GeForce3 would have paralleled the Pentium 4's launch in that the current crop of benchmarks (in this case, games) would not have shown any performance increase that's worth the money.
 
Wishful thinking is often appealing. But I think it healthier to view the world as it is...
That's exceptionally rich coming from you, given this comment:
The minority "won't buy RT HW" crowd is shit outa luck.
... Whereas the most popular RTX enabled card sits at 2.38% per the May Steam Hardware Survey. Nearly every 10-series (and 16-series) card is higher positioned on the chart.

Meanwhile, where you and I buy, the 1080ti sits at 1.6%, whereas the 2080ti still only claims 0.83% of the market, nearly two years after launch.

Yep, that "won't buy RTX hardware" crowd is pretty much a statistical anomaly, it's so dang small... /s
 
Woosh! The point went over your head.

The point is that after the end of this year, all new GPUs will have RT HW. There won't be any new Non-RT GPUs to choose. Do you dispute this?

Also, LOL at your broken logic. How are RT cards supposed to suddenly dominate a survey that includes cards purchased 5+ years ago? Go back in time?

How about you confine the survey results to cards that came out starting in Fall 2018, when RTX was introduced, and see how many people chose RTX vs non-RTX new cards.

That would be Navi vs Turing. Tell me, how does that result look?
 
Try doing a total... you know, math done correctly.
Otherwise I could use Vega's market share on Steam to claim AMD's market share is less than 3%.

But that would be a fallacious claim.
Try again... using REAL math... and then factor in that Pascal was a performance ANOMALY... but I have a feeling that you don't like facts...
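To make the "do a total" point concrete, here's a quick sketch of summing per-SKU survey shares instead of citing a single card against a whole family. The SKU names and share figures below are illustrative placeholders, not actual survey numbers:

```python
# Per-SKU market shares in percent, in the style of the Steam Hardware
# Survey. These figures are illustrative placeholders, NOT real survey data.
rtx_shares = {"RTX 2060": 2.4, "RTX 2070": 2.2, "RTX 2080": 1.3, "RTX 2080 Ti": 0.8}
vega_shares = {"RX Vega 56": 0.3, "RX Vega 64": 0.2}

def family_total(shares):
    """A product family's total share is the sum over all of its SKUs."""
    return round(sum(shares.values()), 2)

# Comparing one SKU from family A against the total of family B
# (or vice versa) is the apples-to-oranges fallacy being called out above.
rtx_total = family_total(rtx_shares)    # whole-family total, not one card's share
vega_total = family_total(vega_shares)
```

The point of the sketch: any single SKU's share is always smaller than its family total, so per-card and per-family numbers can't be compared directly.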
 
So... what's the reason?

It’s the fastest card available, period, and people are willing to pay $1200 for that. If Nvidia thought they could make more money selling Tis for <$1000, that’s what they would do.

Look at it this way. Why would Nvidia sell a 2080 Ti for less just because it doesn’t have RT? That would be charitable but not very smart. It’s not like the lack of RT would have changed the fact that it’s the fastest card available and has no competition.
 
Broken logic? I don't think so. I'm looking at current market share, which by definition shows how many people have and how many people have not bought RTX cards as of today. You can put those goalposts down, too: my counterpoint was only to your assertion that the "won't buy RTX HW" crowd is a minority. That's ridiculous and patently false. If anything, the people who have bought RTX cards are in a minority.

Obviously, of the new card sales since RTX launch, most of them are going to be RTX cards. That's a no brainer - nobody buys last gen new unless they're bargain hunting right after a current gen launch. But that slice of the data ignores the people who might have upgraded, but looked at the RTX lineup and did not see the value in upgrading. Obviously no survey will directly tell us this, but it's a pretty obvious conclusion given the massive price/performance failings in the 20-series compared with previous generations.

To your first point, no - I don't dispute that all new GPUs are likely to have RTX hardware from the upcoming gen onward. If the 2060's "RTX ON" benchmarks are any indication, though, I'm not terribly excited about it either. RTX will be in all the cards - despite not being capable, affordable, or developer-adopted enough to warrant its presence.
 
I mean, if it’s a huge card that’s 500 bucks, it kinda makes sense. I think it’ll fail, obviously, but they could add far more RT cores if they ignore all the rasterization stuff. You could play Quake 2 RTX with a Riva TNT2 Ultra at max RT settings lmao.

In seriousness, these rumors are probably just to get RT more pronounced, even if Nvidia takes a hit.
 
The 2060 actually didn't seem terribly bad with DLSS 2.0 in Control.

The problem with DLSS 2.0 is that only a tiny handful of games use it right now. If its use spreads, then maybe it will be more important, but it does seem to work well.
 
Broken logic? I don't think so. I'm looking at current market share, which by definition shows how many people have and how many people have not bought RTX cards as of today.

That makes no sense. Are you fooling anyone other than yourself by pretending that people who bought cards before any RT existed are part of the "Won't buy RT HW" crowd?

"Haven't bought RT HW yet", is not the same as "Won't buy RT".
 
"Haven't bought RT HW yet", is not the same as "Won't buy RT".
Yes. I just said that. No, the two are not the same, but the latter is absolutely a subset of the former. Are you fooling anyone other than yourself by insisting that the sticker shock of a 2080ti wouldn't dissuade anyone from upgrading where they otherwise might have?

If the 2080 Ti had launched at $699, I'd more than likely own one. Heck knows my 3440x1440 @ 144 Hz would make use of it. But you can't expect me to believe that the first product to debut first-in-class dedicated real-time ray tracing silicon just coincidentally happens to cost almost twice as much as the previous gen and that the two facts aren't related. That's ridiculous.

Raytracing is cool. I will pay for it someday. I just don't want to pay the early-adopter's tax. The number of games I really want to play that utilize RTX wouldn't take one hand to count, and it's not as though I couldn't play those games without RT anyway.
 
Yes. I just said that. No, the two are not the same, but the latter is absolutely a subset of the former.

Irrelevant. You don't get to pretend the latter just because it's contained within the former.

Are you fooling anyone other than yourself by insisting that the sticker shock of a 2080ti wouldn't dissuade anyone from upgrading where they otherwise might have?

No, because I never said, nor even implied any such thing. That's a pathetic attempt at a strawman.

Let's compare RTX Turing cards to Navi cards from the Steam HW survey. This might give some indication of what happens when there is a choice.

RTX Turing Cards: ~8%, Navi Cards: ~1%

That's actual choices at play. :D
 
False dichotomy. Choosing not to upgrade this gen is also a choice. It's just not one that the Steam HW survey shows explicitly.

Also, I never said that "everyone still using a 10-series has actively chosen not to buy the 20-series," but I know for a fact that some people have. Common sense bears out the conclusion.
 
It's two sides of the same coin. "Won't buy RT" is actually, "I won't pay that much for a feature I won't use." And "Haven't bought RT HW yet" is "I guess this lower priced card has RT included, so I'll take it even if it's not powerful enough to do anything."
 
Perhaps ray tracing will be (or is) implemented differently on PS5/XBX to bypass the need for DLSS and therefore not be as resource intensive. That would open the gates for ray tracing to the general public.

Digital Foundry has some analysis of PS5 ray tracing in the demos. They appear to be relying on many of the same optimizations that current RTX games use, so it doesn't look like there is any breakthrough in RT HW performance, nor anything that makes RT less resource intensive.
 
RTX cards are 8.8%, and yet the 1060 by itself is 11.80%. You're right, the market did speak: they decided not to upgrade at the current cost of new cards. Also, Pascal is 34.79% per the Steam hardware survey, which shows just how loudly people think it's not worth upgrading, so yes, their choice was clear.
 
And quad cores are almost 50% of the survey, so Ryzen must be some kind of irrelevant failure, right?

You have to recognize that not everyone runs out to upgrade annually, or even every two years, and when people do upgrade, what do they do with their old cards? Burn them? As long as cards are usable, they will stay in circulation. Pascal was one of Nvidia's biggest success stories ever; those cards will be in circulation for a long time. I don't think anyone is arguing that Turing was a bigger success than Pascal.

But 8.8% is not bad for RTX cards given how expensive they are, while Navi cards have yet to reach 1%.
 
I mean, ray tracing is old as fuck; there really aren't going to be any magic optimizations to make it suddenly fast. It's an algorithm from the early days of computing, and we've had a lot of time to work on it. Also, its overall simplicity (CS students often write a ray tracer as a school project) is one of the things that makes it cool, but it also means that big optimizations aren't really in the cards. It's doing the same simple thing, over and over and over.

The speedups we see, and will continue to see, are going to be brute force: lots of hardware dedicated to doing it fast. That will take time, as there's only so much you can cram onto a chip. It also means that we aren't likely to see AMD or Nvidia or Intel magically pull way ahead; performance will be limited by how much silicon they can throw at it.
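That "same simple thing over and over" really is the whole core loop. A minimal sketch, with the scene, camera, and shading all made up for illustration (one sphere, one ray-sphere intersection test, one distance-based shade per pixel):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return distance t to the nearest hit along the ray, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    # direction is assumed normalized, so the quadratic's 'a' term is 1.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(width, height):
    """Render a grayscale image of one sphere: the per-pixel loop IS the tracer."""
    center, radius = (0.0, 0.0, 3.0), 1.0  # hard-coded toy scene
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a ray through a simple pinhole camera at the origin.
            dx = (2 * x / (width - 1)) - 1
            dy = 1 - (2 * y / (height - 1))
            norm = math.sqrt(dx * dx + dy * dy + 1)
            direction = (dx / norm, dy / norm, 1 / norm)
            t = intersect_sphere((0.0, 0.0, 0.0), direction, center, radius)
            # Shade by hit distance; 0 where the ray misses.
            row.append(0 if t is None else min(255, int(255 / t)))
        image.append(row)
    return image
```

Everything real tracers add (bounces, shadows, BVHs, denoising) is layered on top of exactly this loop, which is why the hardware answer is mostly "run millions of these intersections in parallel."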
 
CPU power is based on need: if a quad core is all you need, then that's all you buy; if you need more cores, you buy appropriately, since more cores only help if you've got the software to use them. Your comparison is nothing like what we're discussing, as this relates to a new GPU feature that came with a massive price increase, and ray tracing performance leaves a lot to be desired as well. Also, plenty of people are still on much older hardware than even Pascal, and they still passed on the increased price of current cards, be it AMD or Nvidia.

8.8% sucks if you're trying to push new tech to software companies; no company will cater to that tiny a market without cash under the table. Navi is just as highly priced, so I don't expect it to sell well either. Also, you need to factor in that 2.38% of that 8.8% is the 2060, and those owners are not doing much if any ray tracing. Underperforming and overpriced has been the motto of this generation of cards. I am hoping it improves with the coming generation, or you'd better get used to consoles being the dominant gaming market again.
 