So what's the betting RTX cores are actually mining cores?

I was surprised when nvidia didn't seem to actively go after mining last year. Surely they could have whipped up some basic ASIC or something, but no.

I'm sure the RTX cores absolutely do their job as ray-tracing accelerators, but that isn't going to help gamers any time soon. Maybe a year from now some dude will get a ray-traced modded Bioshock running on two RTX Titans via NVLink, but realistically, I can't see any application outside of development.

I get that nvidia need to prime the market with hardware ahead of actually pushing ray-traced games, but given the cost of these chips on the node they're on, it would have made a lot more sense to shave off a third of the die space if they're targeting gamers. Well, I guess they have with the 2060. The 2080/2070 must be cut-down Quadro rejects, which just happen to have those RTX cores. I think AdoredTV went over this recently with a supposed internal leak.

Seems an odd move to me though, so I'm fully expecting another mining explosion based around Turing. I mean it *is* called Turing after all! :p
 
Can't AMD and Nvidia attempt to lock out their cards from crypto by disabling or handicapping them if no video output is detected?

Or... just make a dedicated processor for crypto mining altogether. That seems like a missed opportunity for silicon chip manufacturers.
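The detection half of that idea already exists in software, for what it's worth. Here's a rough sketch (just my own illustration, not anything AMD or Nvidia actually ship) that uses NVML's nvmlDeviceGetDisplayActive to spot a headless card; the actual handicapping would have to live in the driver or VBIOS, and a cheap HDMI dummy plug could probably defeat it anyway:

[CODE]
// Illustrative only: query whether a display is initialized on GPU 0 via NVML
// (ships with the NVIDIA driver / CUDA toolkit). Build: nvcc check.cu -lnvidia-ml
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        printf("NVML init failed\n");
        return 1;
    }

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
        nvmlEnableState_t active = NVML_FEATURE_DISABLED;
        if (nvmlDeviceGetDisplayActive(dev, &active) == NVML_SUCCESS) {
            // No display initialized -> headless rig, i.e. what a mining box looks like.
            printf("Display active: %s\n",
                   active == NVML_FEATURE_ENABLED ? "yes" : "no (headless)");
        }
    }

    nvmlShutdown();
    return 0;
}
[/CODE]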
 
Can't AMD and Nvidia attempt to lock out their cards from crypto by disabling or handicapping them if no video output is detected?

Or... just make a dedicated processor for crypto mining altogether. That seems like a missed opportunity for silicon chip manufacturers.
So you'd also handicap the card for high performance compute workloads? I doubt universities would appreciate their cards being nerfed. Same goes for Pixar or whoever else is trying to use them to render huge workloads in a centralized fashion.
 
As much as I don't like how crypto has really f'ed up the casual gamer market (and I'm casual as all get out :)), I don't think it's right to lock people out just because of the inconvenience. It's a sucky situation, but the fix is probably increased volume, which will saturate the market, bring down prices, and also make the used market that much nicer. Unfortunately, increasing production is a major gamble too, and depends on current yields. So the whole situation is in a ton of shiate.

Nvidia and AMD should release "Partner" cards to miners :) Direct-buy cards where a percentage of the mining proceeds goes to the company :)
 
Can't AMD and Nvidia attempt to lock out their cards from crypto by disabling or handicapping them if no video output is detected?

Or... just make a dedicated processor for crypto mining altogether. That seems like a missed opportunity for silicon chip manufacturers.

Seems to me the best plan for nvidia is to use the gaming market to inflate the prices of mining-capable hardware, and then flog that en masse to the pro miners around the world.

Forcing miners to buy expensive cards that can game, and gamers to buy inflated-price cards that can also mine. Both pay more! Genius.
 
I was surprised when nvidia didn't seem to actively go after mining last year. Surely they could have whipped up some basic ASIC or something, but no.

I'm sure the RTX cores absolutely do their job as ray-tracing accelerators, but that isn't going to help gamers any time soon. Maybe a year from now some dude will get a ray-traced modded Bioshock running on two RTX Titans via NVLink, but realistically, I can't see any application outside of development.

I get that nvidia need to prime the market with hardware ahead of actually pushing ray-traced games, but given the cost of these chips on the node they're on, it would have made a lot more sense to shave off a third of the die space if they're targeting gamers. Well, I guess they have with the 2060. The 2080/2070 must be cut-down Quadro rejects, which just happen to have those RTX cores. I think AdoredTV went over this recently with a supposed internal leak.

Seems an odd move to me though, so I'm fully expecting another mining explosion based around Turing. I mean it *is* called Turing after all! :p

You seem to fail to understand that both gaming and mining are compute on the CUDA cores.
This is NOT an odd move.
This is a LONG-planned graphics move.

Drop the tinfoil-hat FUD.
 
Can't AMD and Nvidia attempt to lock out their cards from crypto by disabling or handicapping them if no video output is detected?

As someone who dislikes what mining has done to the consumer gaming market (for GPUs), and would never buy a used mining card (despite anyone who says "WAT, NONSENSE, IT'S FINE, BUY MY USED MINING CARD!") - your argument there is silly and comes down to "Seeing as I can't have ice cream, no one should have ice cream!"
 
As someone who dislikes what mining has done to the consumer gaming market (for GPUs), and would never buy a used mining card (despite anyone who says "WAT, NONSENSE, IT'S FINE, BUY MY USED MINING CARD!") - your argument there is silly and comes down to "Seeing as I can't have ice cream, no one should have ice cream!"

I'm not arguing or even debating for that matter. Just giving ideas and I don't understand why some here lost their shit over them, yeesh. Anyways, I have yet to see many other ideas that could help gamers like myself deal with crypto miners ruining our hobby.
 
Mining Cores? Keep on smoking it.
Well technically GPU cores are mining cores, so...

I wonder if tensor cores could be used for mining.

BTW I think the Titan V could do like 90 MH/s on ETH, so if Turing comes anywhere near that, miners will be all over them.
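For the curious, a mining kernel is basically an embarrassingly parallel nonce search, which is exactly what thousands of CUDA cores are good at. Rough toy sketch below (the mix function is a stand-in, not real Ethash/Keccak): every thread grinds its own nonce and tests it against a target. I doubt the tensor cores help much there though, since they do half-precision matrix math rather than the integer/bitwise grinding hashing needs.

[CODE]
// Toy illustration of why "GPU cores are mining cores": mining is just a
// massively parallel nonce search. The mixer below is a stand-in hash,
// NOT real Ethash/Keccak. 64-bit atomicMin needs sm_35 or newer.
#include <cstdio>
#include <cuda_runtime.h>

__device__ unsigned long long toy_hash(unsigned long long header,
                                       unsigned long long nonce) {
    unsigned long long x = header ^ (nonce * 0x9E3779B97F4A7C15ULL);
    x ^= x >> 33; x *= 0xFF51AFD7ED558CCDULL;
    x ^= x >> 33; x *= 0xC4CEB9FE1A85EC53ULL;
    x ^= x >> 33;
    return x;
}

__global__ void search(unsigned long long header, unsigned long long start,
                       unsigned long long target, unsigned long long *found) {
    unsigned long long nonce = start + blockIdx.x * blockDim.x + threadIdx.x;
    if (toy_hash(header, nonce) < target)     // "share" found
        atomicMin(found, nonce);              // remember the lowest winning nonce
}

int main() {
    unsigned long long *found;
    cudaMallocManaged(&found, sizeof(*found));
    *found = ~0ULL;

    // Every shader grinds its own nonce; more CUDA cores = more hashrate.
    search<<<4096, 256>>>(0x1234ULL, 0ULL, 1ULL << 48, found);
    cudaDeviceSynchronize();

    if (*found != ~0ULL) printf("share found at nonce %llu\n", *found);
    else                 printf("no share in this batch, keep grinding\n");
    cudaFree(found);
    return 0;
}
[/CODE]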
 
Well technically GPU cores are mining cores, so...

I wonder if tensor cores could be used for mining.

BTW I think the Titan V could do like 90 MH/s on ETH, so if Turing comes anywhere near that, miners will be all over them.

Not really arguing with you.
 
I'm not arguing or even debating for that matter. Just giving ideas and I don't understand why some here lost their shit over them, yeesh. Anyways, I have yet to see many other ideas that could help gamers like myself deal with crypto miners ruining our hobby.

Here's my advice: git gud at buying cards before the miners do. As a gamer, you should understand that concept. The miners also benefit us in a way by generating more revenue for nvidia (more cards sold/shipped - even if not at higher prices like third party sellers/store fronts benefit from) - which leads to more money for R&D and such, which leads to better products down the road. There's no reason for nvidia to try and keep miners out. A card sold is a card sold. So your suggestion there literally came from a place of selfishly not wanting anyone else to be allowed ice cream because you couldn't get any.
 
Can't AMD and Nvidia attempt to lock out their cards from crypto by disabling or handicapping them if no video output is detected?

Or... just make a dedicated processor for crypto mining altogether. That seems like a missed opportunity for silicon chip manufacturers.

Works until you put a dongle on that simulates video output, then what?
You have to realize the unit is not in nvidia's control once it's in the hands of the user.
 
Can't AMD and Nvidia attempt to lock out their cards from crypto by disabling or handicapping them if no video output is detected?

Or... just make a dedicated processor for crypto mining altogether. That seems like a missed opportunity for silicon chip manufacturers.

I think they could via drivers. I mean, Pascal sucked big time at mining when it first came out. It took several driver revisions for it to be good for mining.
 
Here's my advice: git gud at buying cards before the miners do. As a gamer, you should understand that concept. The miners also benefit us in a way by generating more revenue for nvidia (more cards sold/shipped - even if not at higher prices like third party sellers/store fronts benefit from) - which leads to more money for R&D and such, which leads to better products down the road. There's no reason for nvidia to try and keep miners out. A card sold is a card sold. So your suggestion there literally came from a place of selfishly not wanting anyone else to be allowed ice cream because you couldn't get any.

Ice cream for those that like cream. Crypto folk can eat the non-dairy variety ;)
 
Here's my advice: git gud at buying cards before the miners do. As a gamer, you should understand that concept. The miners also benefit us in a way by generating more revenue for nvidia (more cards sold/shipped - even if not at higher prices like third party sellers/store fronts benefit from) - which leads to more money for R&D and such, which leads to better products down the road. There's no reason for nvidia to try and keep miners out. A card sold is a card sold. So your suggestion there literally came from a place of selfishly not wanting anyone else to be allowed ice cream because you couldn't get any.

I couldn't agree more with this statement. Look at AMD. They sold each and every single card and even they had a great year. It might serve them well in the end, since they can invest that money back into R&D. I was seriously worried for Polaris and Vega and thought it would end badly. But with Ryzen selling well, and Vega and Polaris selling well because of mining, it probably generated some good money that will go back into R&D.
 
You seem to fail to understand that both gaming and mining are compute on the CUDA cores.
This is NOT an odd move.
This is a LONG-planned graphics move.

Drop the tinfoil-hat FUD.

Ok, keep calm! I was just ignorantly speculating that I'd expect these new cores to be exploited by mining in some fashion. Time will tell.

You confidently think otherwise, and I hope you're right.

The oddity to me is nvidia basically doing a Fermi again. Those RTX cards are gonna be very expensive, aren't they...
 
Ok, keep calm! I was just ignorantly speculating that I'd expect these new cores to be exploited by mining in some fashion. Time will tell.

You confidently think otherwise, and I hope you're right.

The oddity to me is nvidia basically doing a Fermi again. Those RTX cards are gonna be very expensive, aren't they...

Ignorance is the worst foundation...it inspires other muppets to run with it like it is gospel...Dunning-Kruger effect in action. Not funny.

NVIDIA is not doing a "Fermi" here, they are doing a "G80".
G80 was the last BIG change in NVIDIA graphics (going from the Pixel, Vertex & Geometry pipeline to Unified cores (CUDA)).
It's been their backbone ever since.
Now they are pushing for Raytracing (hence the change from GTX to RTX in naming).
Raytracing has been THE "holy grail" since graphics were "invented"...everything up until now has been trying to "simulate" Raytracing.

And if you have the GAMING RTX prices please do share (and then add in inflation and compare with the G80 GPU MSRP)...otherwise please stop posting from ignorance...it's boring, annoying to have to refute...and a waste of everyone's time (aka life) as you go from one retarded comment to the next without brakes.
 
Ignorance is the worst foundation...it inspires other muppets to run with it like it is gospel...Dunning-Kruger effect in action. Not funny.

NVIDIA is not doing a "Fermi" here, they are doing a "G80".
G80 was the last BIG change in NVIDIA graphics (going from the Pixel, Vertex & Geometry pipeline to Unified cores (CUDA)).
It's been their backbone ever since.
Now they are pushing for Raytracing (hence the change from GTX to RTX in naming).
Raytracing has been THE "holy grail" since graphics were "invented"...everything up until now has been trying to "simulate" Raytracing.

And if you have the GAMING RTX prices please do share (and then add in inflation and compare with the G80 GPU MSRP)...otherwise please stop posting from ignorance...it's boring, annoying to have to refute...and a waste of everyone's time (aka life) as you go from one retarded comment to the next without brakes.

Ignorance is bliss. :p

Honestly I dunno why you're getting so wound up, speculation is fun, and that's all we have till Monday...
 
Most people I have met suffering from the Dunning-Kruger effect are not happy.
They tend to be very angry at the world...and at facts ;)

And yet, ironically, you've been a total bundle of joy today!

You must be having a bad one....

In any case, it's not the Dunning–Kruger effect if you're aware of your own ignorance and embrace it, which we all are concerning the new RTX cores. I'm just speculating/expecting these cores to be used for something other than ray tracing in the near future.
 
And yet, ironically, you've been a total bundle of joy today!

You must be having a bad one....

In any case, it's not the Dunning–Kruger effect if you're aware of your own ignorance and embrace it, which we all are concerning the new RTX cores. I'm just speculating/expecting these cores to be used for something other than ray tracing in the near future.

Should I record my bursts of laughter when reading a retarded post? ;)
Just because I don't hold your hand and sugar-coat my words doesn't make me angry.
But experience has taught me that nice words do nothing against ignorance.
If they had a possible scenario other than Raytracing...NVIDIA would not have named them "RT Cores" (RayTracing Cores).
Look at the naming of their other cores:

CUDA Cores = Compute Unified Device Architecture Cores (Their broad multi-use compute cores)
TENSOR Cores = For doing TENSOR operations: https://en.wikipedia.org/wiki/Tensor

So what is your basis for your OP....if not "ignorance"? ;)
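To make the distinction concrete, here's a minimal sketch (illustrative only, my own) of the one operation TENSOR cores exist for: a warp-wide 16x16x16 half-precision D = A*B + C multiply-accumulate through CUDA's WMMA API (Volta/Turing, sm_70+). Gaming shaders and mining kernels both run as ordinary CUDA-core compute; this fused matrix op is the TENSOR core's whole job:

[CODE]
// One warp driving a Tensor core: a single 16x16x16 half-precision
// D = A*B + C multiply-accumulate via the WMMA API (needs sm_70 or newer).
// Dense matrix FMA like this is what Tensor cores do -- not the integer
// bit-twiddling that hashing/mining needs.
#include <cstdio>
#include <cuda_fp16.h>
#include <mma.h>
using namespace nvcuda;

__global__ void tensor_tile(const half *a, const half *b, float *d) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> fa;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> fb;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc;

    wmma::fill_fragment(acc, 0.0f);            // C = 0
    wmma::load_matrix_sync(fa, a, 16);         // load the 16x16 A tile
    wmma::load_matrix_sync(fb, b, 16);         // load the 16x16 B tile
    wmma::mma_sync(acc, fa, fb, acc);          // D = A*B + C on the Tensor core
    wmma::store_matrix_sync(d, acc, 16, wmma::mem_row_major);
}

int main() {
    half *a, *b; float *d;
    cudaMallocManaged(&a, 256 * sizeof(half));
    cudaMallocManaged(&b, 256 * sizeof(half));
    cudaMallocManaged(&d, 256 * sizeof(float));
    for (int i = 0; i < 256; ++i) { a[i] = __float2half(1.0f); b[i] = __float2half(1.0f); }

    tensor_tile<<<1, 32>>>(a, b, d);           // one warp cooperatively owns the tile
    cudaDeviceSynchronize();
    printf("d[0] = %.1f (expect 16.0)\n", d[0]);
    return 0;
}
[/CODE]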
 
 
The miners also benefit us in a way by generating more revenue for nvidia (more cards sold/shipped - even if not at higher prices like third party sellers/store fronts benefit from) - which leads to more money for R&D and such, which leads to better products down the road.

That's a wonderful theory, until you realize just how long Pascal has been the "current" generation GPU. You're getting the higher prices, but you aren't getting better products any sooner. A benefit for nVidia, for sure. They're selling what are now high-yield parts at a premium. I'm not seeing a benefit for the end user though.
 
While I agree with the general point you made in your post, please don't use "retarded" as an insult. It's so unnecessarily offensive to disabled people.

I don't belong to the "PC brigade", so that would be a no.
 
https://www.dictionary.com/browse/retarded

Noun, 4b

And I was referring to his arguments, not his person...SJW/PC brigades bore me.

I wasn't offended, and you may find this unbelievable, but I agree with you, "retarded" is a perfectly fine word in the right context.

Factum said:
please stop posting from ignorance...it's boring, annoying to have to refute...and a waste of everyone's time (aka life) as you go from one retarded comment to the next without brakes.

You have to refute it and it annoys you? No one forced you into this thread man.

So what is your basis for your OP....if not "ignorance"? ;)

That's exactly where I'm coming from! And I do believe I stated as much. I hardly know anything about these new RTX cores. :wacky:

But please go on, get the last word in, it's cool. :cool:
 
I wasn't offended, and you may find this unbelievable, but I agree with you, "retarded" is a perfectly fine word in the right context.



You have to refute it and it annoys you? No one forced you into this thread man.



That's exactly where I'm coming from! And I do believe I stated as much. I hardly know anything about these new RTX cores. :wacky:

But please go on, get the last word in, it's cool. :cool:

The RT cores are in fact VERY specialized (ASIC); it's very hard to find a use for them outside the VERY specific task they are designed for.
They don't do RayTracing on their own; they access the Streaming Multiprocessors for Ray and Triangle Intersection, use the Bounding Volume Hierarchy for Buffer Access, and utilize the TENSOR cores for AI Inferencing assisting in the final RayTracing image.

So SPECIALIZED hardware using other parts of the SKU for their task, not the other way around.

"Speculation" enough for you?

(You might want to google ASIC) ;)
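If you want to see the kind of work being offloaded, here's a rough software sketch (illustrative only, my own) of a Möller–Trumbore ray/triangle intersection test. Without the new hardware, the shader (CUDA) cores have to run something like this for every ray against every candidate triangle a BVH traversal hands back; the fixed-function RT path exists to take exactly that grind off their hands.

[CODE]
// Software sketch of the inner loop RT cores accelerate: a Moller-Trumbore
// ray/triangle intersection test, run here on ordinary CUDA cores.
#include <cstdio>
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };

__device__ Vec3  vsub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
__device__ Vec3  vcross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
__device__ float vdot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Returns hit distance t (> 0), or -1 if the ray misses triangle (v0, v1, v2).
__device__ float intersect(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
    Vec3 e1 = vsub(v1, v0), e2 = vsub(v2, v0);
    Vec3 p  = vcross(dir, e2);
    float det = vdot(e1, p);
    if (fabsf(det) < 1e-8f) return -1.0f;          // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 tv = vsub(orig, v0);
    float u = vdot(tv, p) * inv;
    if (u < 0.0f || u > 1.0f) return -1.0f;        // outside barycentric range
    Vec3 q = vcross(tv, e1);
    float v = vdot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return -1.0f;
    float t = vdot(e2, q) * inv;
    return t > 0.0f ? t : -1.0f;
}

__global__ void one_ray(float *t_out) {
    Vec3 orig = {0.0f, 0.0f, -1.0f}, dir = {0.0f, 0.0f, 1.0f};
    Vec3 v0 = {-1.0f, -1.0f, 0.0f}, v1 = {1.0f, -1.0f, 0.0f}, v2 = {0.0f, 1.0f, 0.0f};
    *t_out = intersect(orig, dir, v0, v1, v2);     // in a real tracer this runs per ray, per BVH leaf
}

int main() {
    float *t;
    cudaMallocManaged(&t, sizeof(float));
    one_ray<<<1, 1>>>(t);
    cudaDeviceSynchronize();
    printf("hit at t = %.2f (expect 1.00)\n", *t);
    return 0;
}
[/CODE]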
 
The RT cores are in fact VERY specialized (ASIC); it's very hard to find a use for them outside the VERY specific task they are designed for.
They don't do RayTracing on their own; they access the Streaming Multiprocessors for Ray and Triangle Intersection, use the Bounding Volume Hierarchy for Buffer Access, and utilize the TENSOR cores for AI Inferencing assisting in the final RayTracing image.

So SPECIALIZED hardware using other parts of the SKU for their task, not the other way around.

"Speculation" enough for you?

(You might want to google ASIC) ;)

Honestly, thanks for the clarification.

So with the Tensor cores and RTX cores necessarily being specialised, that's 2/3rds of the die that can't immediately be used in today's games. For a card aimed at gaming. I just find that odd given how expensive these chips must be to produce, hence why I speculated the RTX cards would be very expensive compared to the GTXs.

At least with Fermi, the Tessellation stuff could be used in actual games immediately.
 
Calm down guys, crypto has already moved to dedicated hardware, hence the crypto revenue crash for nvidia (it dropped 90% from Q1 to Q2).
 
Honestly, thanks for the clarification.

So with the Tensor cores and RTX cores necessarily being specialised, that's 2/3rds of the die that can't immediately be used in today's games. For a card aimed at gaming. I just find that odd given how expensive these chips must be to produce, hence why I speculated the RTX cards would be very expensive compared to the GTXs.

At least with Fermi, the Tessellation stuff could be used in actual games immediately.

JUST stop...please.
Please document that TENSOR cores and RT Cores take up 2/3rds of the die....this is what I mean about posting from ignorance. *sigh*
 
JUST stop...please.
Please document that TENSOR cores and RT Cores take up 2/3rds of the die....this is what I mean about posting from ignorance. *sigh*

Sorry, my mistake, I meant a 1/3rd, as in:

[attached image: "Winding up Factum who always has get the last word in and can't let himself ignore me either lol.png"]


Ok, so maybe more like a quarter, potato potahto.

But sure, I'll just stop right there, cheers for sorting all this out. :muted:
 
So you'd also handicap the card for high performance compute workloads? I doubt universities would appreciate their cards being nerfed. Same goes for Pixar or whoever else is trying to use them to render huge workloads in a centralized fashion.
I can't speak for every university, but the one I work for would buy Teslas for HPC tasks. If you're buying GeForce cards for HPC purposes, you're small time.

They could easily do some sort of BIOS thing that handicaps GeForce cards if no display is detected, but I have a feeling that would just prompt the miners to come up with a display-faker dongle or something.
 
I was surprised when nvidia didn't seem to actively go after mining last year. Surely they could have whipped up some basic ASIC or something, but no.

I'm sure the RTX cores absolutely do their job as ray-tracing accelerators, but that isn't going to help gamers any time soon. Maybe a year from now some dude will get a ray-traced modded Bioshock running on two RTX Titans via NVLink, but realistically, I can't see any application outside of development.

I get that nvidia need to prime the market with hardware ahead of actually pushing ray-traced games, but given the cost of these chips on the node they're on, it would have made a lot more sense to shave off a third of the die space if they're targeting gamers. Well, I guess they have with the 2060. The 2080/2070 must be cut-down Quadro rejects, which just happen to have those RTX cores. I think AdoredTV went over this recently with a supposed internal leak.

Seems an odd move to me though, so I'm fully expecting another mining explosion based around Turing. I mean it *is* called Turing after all! :p

They're smart. They realize that mining is a stupid fad based on nothing, and it's foolish/risky to invest a lot into something that you know is based on nothing.

Mining has always been a dumb scam.
 