[RUMOUR] Nvidia to introduce add-in cards for Ray Tracing

If this rumor turns out to be true, we might be getting RTX ray-tracing add-in cards (AICs) that enable strong ray-tracing performance on a GPU without dedicated ray-tracing hardware (likely only NVIDIA ones).

https://wccftech.com/nvidia-traversal-coprocessor-rtx-3090-gpu/amp/?__twitter_impression=true

Source (via Wccftech):
NVIDIA is planning to introduce a revolutionary new idea involving a traversal coprocessor and a dual-sided PCB with its RTX 3090 GPU (and family), according to a video by Coreteks.

Oops, duplicate thread:

https://hardforum.com/threads/coreteks-talks-traversal-coprocessors.1997824/
 
What purpose would this product serve? This has to be fake. I bet Jensen was hanging out in his kitchen, as he does, thinking AMD’s rumour mill has been in the headlines too much these past weeks and decided to steal some thunder. And BAM, he cooks this one up: there’s enough there to be technically plausible, but no market demand to make it an actual thing. Unless this card is dirt cheap it serves no purpose other than to gut their own sales, and at a low price point there is no incentive to make or sell it, so it would be a paper launch. Unless it was a data centre or workstation part specifically to accelerate rendering applications where SLI is overkill... that could work, but is there software for it?
 
Anyone with half a brain knows this is nonsense. Ray tracing is heavily dependent on shading, and any RT hardware needs to be on-chip, close to the shader cores, for any sort of useful performance. The very patent people are quoting to support this nonsense rumor says so itself.
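
To spell out why that coupling matters: in a DXR-style renderer, traversal and shading alternate every bounce, because shading is what decides the next ray. A toy CPU-side sketch of the loop (stand-in types and stub functions, nothing from a real API):

```cpp
#include <cstdio>

// Toy stand-ins, purely illustrative -- not a real API.
struct Ray { float o[3], d[3]; };
struct Hit { bool valid; int material; };

// BVH traversal: the part the rumored coprocessor would own.
// Stubbed to "hit" something on the first few calls.
static int g_calls = 0;
Hit traverse(const Ray&) { return { ++g_calls < 4, 0 }; }

// Shading runs on the shader cores and decides the next ray
// (reflection, shadow ray, ...). Returns false when the path ends.
bool shade(const Hit& h, Ray& next) {
    next.o[0] += 1.0f;        // pretend the ray bounced somewhere new
    return h.material == 0;   // pretend this material keeps reflecting
}

int main() {
    Ray r = {};
    for (int bounce = 0; bounce < 8; ++bounce) {
        Hit h = traverse(r);              // traversal...
        if (!h.valid || !shade(h, r))     // ...then shading emits the next ray
            break;
        std::printf("bounce %d: traversal -> shading -> new ray\n", bounce);
    }
    // Every iteration is a traversal->shading->traversal round trip:
    // cheap when the RT units sit next to the SMs, a link transfer per
    // bounce if traversal lives on a separate card.
}
```

Stick the traversal half of that loop on the far side of a PCIe slot and you pay the link latency once per bounce, per ray in flight.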
 
Chances of an add-in RT card happening are about as good as me winning the lotto.
 
This isn't as far-fetched as it sounds. The first part is that the RT hardware gets moved off the GPU die itself and onto a separate IC on the PCB. That would give Nvidia higher yields and, theoretically, a cheaper die. The second part is that if they can move the RT hardware onto a separate IC, why couldn't they put it on an AIC? I think the first part is plausible even if the second is unlikely.
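
The yield half of that is easy to sanity-check with napkin math. A sketch using the standard negative-binomial yield model; the die areas, defect density, and clustering factor below are numbers I made up for illustration, not anything from Nvidia:

```cpp
#include <cmath>
#include <cstdio>

// Negative-binomial die yield model: Y = (1 + A*D0/alpha)^(-alpha).
// A = die area in cm^2, D0 = defects/cm^2, alpha = clustering factor.
double yield(double area_cm2, double d0 = 0.1, double alpha = 2.0) {
    return std::pow(1.0 + area_cm2 * d0 / alpha, -alpha);
}

int main() {
    const double big = 6.0;            // hypothetical 600 mm^2 monolithic die
    const double gpu = 4.5, rt = 1.5;  // hypothetical split: GPU die + RT chip

    double y_big = yield(big);
    double y_gpu = yield(gpu), y_rt = yield(rt);

    // Silicon cost scales roughly with area / yield (good dies per wafer).
    double cost_mono  = big / y_big;
    double cost_split = gpu / y_gpu + rt / y_rt;

    std::printf("monolithic yield: %.1f%%\n", 100 * y_big);
    std::printf("split yields: GPU %.1f%%, RT %.1f%%\n", 100 * y_gpu, 100 * y_rt);
    std::printf("relative silicon cost, split vs mono: %.2f\n",
                cost_split / cost_mono);
}
```

With those made-up numbers the split silicon comes out roughly 15% cheaper per good GPU+RT pair, which is the usual chiplet argument. What the toy math ignores is the packaging cost and the die-to-die interconnect.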
 
I feel like unless this is included in all of their chips except the bottom budget-oriented ones, developers will skip adding RTX features. PhysX add-in cards had this problem: no sane developer will waste resources developing features for a 1% market share unless it's an AAA title funded by Nvidia.
 
PhysX is proprietary middleware implemented in a proprietary language (CUDA). DXR and Vulkan ray tracing are hardware-agnostic, industry-standard APIs. There’s no comparison.
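
And that hardware-agnosticism is visible in how games query support. A minimal Windows-only sketch against the public D3D12 feature-check API (error handling trimmed): the application only ever asks for a raytracing tier; nothing in the API cares which vendor provides it or where the hardware physically sits.

```cpp
// Minimal DXR capability check. Windows only; link with d3d12.lib.
#include <d3d12.h>
#include <cstdio>

int main() {
    ID3D12Device* dev = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 __uuidof(ID3D12Device), (void**)&dev)))
        return 1;

    // The app asks "what raytracing tier do I get?" -- a vendor-neutral
    // question. AMD, Intel, or Nvidia hardware all answer the same way.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(dev->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        std::printf("DXR: %s\n",
                    opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0
                        ? "supported" : "not supported");

    dev->Release();
    return 0;
}
```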
 
We're still talking about hardware acceleration, so my point stands. It's either all-in or nothing. Why do you think devs code for the lowest common denominator? If that weren't the case, they would all be jumping on board with RTX and DLSS, yet the only time you see these is when companies like Nvidia pay developers to implement their tech.

Until it's so ubiquitous that all hardware has access to it, it remains a niche market served only by a few select AAA titles.
 
That’s the case for any new rendering tech. Even when the hardware is on-chip, it takes years to achieve wide adoption. The idea that a separate board would be required to support a standard DirectX/Vulkan feature is a joke.
 
What is this 1% market share nonsense? RT hardware is already here in the RTX 2000 series (already at 8.8% in the Steam hardware survey), coming soon in both new consoles, and in AMD discrete GPUs as well.

Everyone with a brain knows that RT is the next rendering paradigm to go widespread, and it will soon be table stakes in AAA games.

It doesn't matter that the lowest-end hardware won't have RT units; those are potato cards that already need to turn off top-end effects anyway. Do developers skimp on ultra settings because some people run potato cards? Nope.

Developers "support" low-end cards, but they don't limit effects to low-end card capabilities.

None of that has anything to do with nonsense RT coprocessor cards.
 
In between all the Nvidia fans losing their shit without actually reading the article, its point (even if misquoted in the OP) is that there is a rumor of a coprocessor on the PCB that takes over some of the RTX functions from the main die.

Assuming it can be implemented correctly, I can see this having value in increasing the yields of the GPU die (and, subsequently, of the RTX coprocessor), and possibly in allowing more power and heat budget for the coprocessor, resulting in higher performance and the ability to cool it efficiently.
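
The part I'd want numbers on is the link between the two chips. Napkin math (every figure below is an assumption I picked just to size the problem, not a leaked spec):

```cpp
#include <cstdio>

int main() {
    // All assumptions, chosen only to size the problem:
    const double pixels      = 2560.0 * 1440.0; // 1440p
    const double rays_per_px = 2.0;             // e.g. primary + one shadow ray
    const double fps         = 60.0;
    const double bytes_ray   = 32.0;            // ray origin/dir/tmin/tmax out
    const double bytes_hit   = 16.0;            // hit t, primitive id, etc. back

    double rays_per_s = pixels * rays_per_px * fps;
    double traffic    = rays_per_s * (bytes_ray + bytes_hit); // bytes/s

    std::printf("rays/s: %.0f million\n", rays_per_s / 1e6);
    std::printf("traffic: %.1f GB/s (PCIe 4.0 x16 is ~32 GB/s)\n",
                traffic / 1e9);
    // ~21 GB/s for just two rays per pixel -- most of a PCIe 4.0 x16 link,
    // before counting extra bounces or the per-bounce latency.
}
```

A wide on-PCB link between two chips could plausibly carry that; a separate add-in card hanging off the slot is where both the bandwidth and the per-bounce latency get ugly.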
 
I realised Kyle had already started a thread (on the technical aspects) after posting this. I probably missed it in the initial scan because it didn't have Nvidia in the title.

The speculation about a stand-alone RT AIC comes from wccftech, which makes sense to me (if two chips on the same card, then why not two chips on two cards?), but it might be a longer-term thing. Not on the immediate horizon, I think.

 
Yes please, I need the next RTX GPU with separate expansion cards for RT and also DLSS, for maximum epeen. Please bring back expandable RAM chips too, with their associated chip creep.
 
You don't need to see everything through the lens of brand fans.

From my perspective it isn't about brands, it's about BS clickbait rumors with low probability.

We get a real rumor silly season whenever it looks like a new product is on the horizon.

This looks like just another silly-season rumor, based on NOTHING at all, just total speculation.
 
All the well-established Nvidia fans in the forum are in this post waxing eloquent about how stupid this idea is. I figured it was a convention or something.

I wouldn't say it's based on "NOTHING" (emphasis yours). You were the one who posted the shroud with fans on two sides. It makes somewhat more sense why the fan orientation differs from the normal design if they were trying to cool two chips on one PCB.
 
It worries me that you don’t think it’s stupid. You know there’s actual technical documentation out there on this stuff, straight from Nvidia? Not all of us get our info from YouTube wack jobs.

How does the cooler design support this rumor? You do realize the second fan blows through the heat sink with no PCB in the middle.
 
I don't read it because I don't care to. No need to worry about poor old me. Maybe you should look in on FrgMstr, who posted the other thread about the theoretical coprocessor. This was the first I'd even heard of the coprocessor theory, and the shroud change doesn't come in a vacuum.
 
I think it's a good and a bad idea. I love innovation, and I'm glad to see Nvidia try something new (like putting the cores on the backside of the GPU). I just don't think an add-in card is a good idea; I mean, if AMD can do ray tracing just fine with Navi 2/RDNA 2, Nvidia should be able to as well.

I mean, I am interested to see what they have planned... innovation is great.
 
That’s your choice. But don’t be mad at the people who are willing to read and educate themselves.
 
Yeah I wish any posts using wccftech as a source were auto-nuked. That site is pure garbage.
 
Nah, you only call it a clickbait rumor when you don't like the rumor; otherwise you have no issue with it. It's just a rumor, but your opinion is no better than anyone else's.
 
I call it clickbait when it's obviously false nonsense that people should be able to see through.

My opinion is better than that of anyone speculating that the leaked cooler design is because of another chip on the backside of the PCB.

The leaked cooler does NOT cool the backside of the PCB, so the basic premise of this house-of-cards speculation is faulty.

The backside fan/fins only start after the PCB ends, and the entire cooler contacts the front of the PCB as it always has.

The actual benefit of this kind of cooler is that you get more efficiency blowing directly through a heatsink than blowing against one that's blocked by a PCB.
 
If there is an RTX add-on card, I would buy one if there's a baby one that's like $199. After using RTX Voice, I can see where Nvidia can do some AI-based stuff that's awesome. (Yes, I got RTX Voice working on my non-RTX cards, but the 2070 is the only one it works on without a problem.)
 
I'd like to see this.

I run a dedicated 750 Ti just for PhysX next to my 2070 Ultra, so a PhysX- and RT-dedicated card next to my GPU would be great in my book. As I mentioned in Kyle's thread, I'd love to see a title take advantage of this and make heavy use of RT and PhysX.
 