Nvidia will add anti-mining flags to the rest of its RTX 3000 GPU series going forward

I support Nvidia crippling mining, as I don't think Nvidia envisioned 20 years ago that their GPUs would eventually be used to make money by mining.

However, GPUs are not designed for a single purpose. As a professional data scientist, I take offense at that statement! The RTX 3090 is definitely marketed toward data science and scientific computing applications, and those applications perform tasks similar to mining.

I am curious, though, whether this mining limitation will affect data science applications.
I'm pretty sure Nvidia has a professional line that you should be using, since "GeForce is made for gaming"...

https://appuals.com/nvidia-promises...rformance-assures-geforce-is-made-for-gaming/
 
I support Nvidia crippling mining, as I don't think Nvidia envisioned 20 years ago that their GPUs would eventually be used to make money by mining.

However, GPUs are not designed for a single purpose. As a professional data scientist, I take offense at that statement! The RTX 3090 is definitely marketed toward data science and scientific computing applications, and those applications perform tasks similar to mining.

I am curious, though, whether this mining limitation will affect data science applications.
GeForce has always been NVIDIA's gaming brand. The A100 and its variants were made for data crunching.
 
I support Nvidia crippling mining, as I don't think Nvidia envisioned 20 years ago that their GPUs would eventually be used to make money by mining.

However, GPUs are not designed for a single purpose. As a professional data scientist, I take offense at that statement! The RTX 3090 is definitely marketed toward data science and scientific computing applications, and those applications perform tasks similar to mining.

I am curious, though, whether this mining limitation will affect data science applications.
Oh, the irony and double standards…
Do you, by chance, make money, or otherwise benefit beyond strictly gaming, with your data-science use of your 3090?

Shame. You should be using Nvidia’s professional line!

/s
 
GeForce has always been NVIDIA's gaming brand. The A100 and its variants were made for data crunching.

While I would agree, the tensor cores and VRAM say otherwise. Tensor cores and AI performance are themselves data science technology. Sure, they are vital to running DLSS, but that is still a data-science-centered technology.

The RTX 3090 is marketed as the Titan RTX replacement, I believe, and the Titan RTX is a card marketed for data science. That is why I specifically mentioned the RTX 3090. The agency I work for gets discounts on the RTX 3090 directly from Nvidia for use in data science.

Yes, I know it is marketed toward gaming because that's the biggest audience. Behind the scenes, the 3090 is definitely developed with other uses in mind.
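To make that concrete: the tensor cores DLSS relies on are the same units any CUDA programmer can drive on a GeForce card through the WMMA API. A minimal sketch of my own (not Nvidia sample code): one warp computing a 16x16x16 FP16 multiply with FP32 accumulate, the core operation of deep learning training. Assumes an sm_70-or-newer GPU and 16x16 row-major matrices:

```cuda
// Minimal tensor-core sketch using CUDA's WMMA API (illustrative only).
// One warp computes C = A * B for a 16x16x16 tile: FP16 inputs with
// FP32 accumulation -- the same op deep learning training is built on.
#include <mma.h>
#include <cuda_fp16.h>
using namespace nvcuda;

__global__ void wmma_16x16x16(const half *A, const half *B, float *C) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> b;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc;

    wmma::fill_fragment(acc, 0.0f);     // start the accumulator at zero
    wmma::load_matrix_sync(a, A, 16);   // leading dimension = 16
    wmma::load_matrix_sync(b, B, 16);
    wmma::mma_sync(acc, a, b, acc);     // executes on the tensor cores
    wmma::store_matrix_sync(C, acc, 16, wmma::mem_row_major);
}

// Launch with a single warp: wmma_16x16x16<<<1, 32>>>(dA, dB, dC);
```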
 
Oh, the irony and double standards…
Do you, by chance, make money, or otherwise benefit beyond strictly gaming, with your data-science use of your 3090?

Shame. You should be using Nvidia’s professional line!

/s
Since I work for a federal agency, no, we don't and can't use our cards to make money. We use them for research.
 
While I would agree, the tensor cores and VRAM say otherwise. Tensor cores and AI performance are themselves data science technology. Sure, they are vital to running DLSS, but that is still a data-science-centered technology.

The RTX 3090 is marketed as the Titan RTX replacement, I believe, and the Titan RTX is a card marketed for data science. That is why I specifically mentioned the RTX 3090. The agency I work for gets discounts on the RTX 3090 directly from Nvidia for use in data science.

Yes, I know it is marketed toward gaming because that's the biggest audience. Behind the scenes, the 3090 is definitely developed with other uses in mind.
It's advertised as having "TITAN class performance," not as a TITAN replacement. To me, the 3090 is effectively the successor to the TITAN RTX, but with NVIDIA's brand delineation, that is not what it is "officially."
 
It's advertised as having "TITAN class performance," not as a TITAN replacement. To me, the 3090 is effectively the successor to the TITAN RTX, but with NVIDIA's brand delineation, that is not what it is "officially."
The Titans have always existed in a weird product space, and I am surprised NVidia kept the line around for as long as it did. They offered too many of the Quadro features without the support, so users could make software tweaks that got performance very close to the Quadro lineup without quite meeting it, while costing significantly more than the closest GeForce competitor for the performance they did bring. Rolling the product down into the GeForce space certainly makes sense, especially since NVidia has been releasing its creator drivers alongside the gaming ones.

Them killing off the Quadro series confused me, because the A-series branding and marketing leaves gaps that are still being filled by Quadro RTX equipment, which is all last-gen. Then again, given the shortages, if they had launched a set of A-series cards to complement the Quadro RTX series, their silicon shortages would be that much more apparent, so I sort of get not launching anything there. But they haven't even leaked or announced anything in that space, which just seems odd to me. These are strange times we live in...
 
Them killing off the Quadro series confused me, because the A-series branding and marketing leaves gaps that are still being filled by Quadro RTX equipment, which is all last-gen. Then again, given the shortages, if they had launched a set of A-series cards to complement the Quadro RTX series, their silicon shortages would be that much more apparent, so I sort of get not launching anything there. But they haven't even leaked or announced anything in that space, which just seems odd to me. These are strange times we live in...
They didn't kill the Quadro line; they just dropped the name.

Quadro RTX 4000 (2070) -> RTX A4000 (3070)
Quadro RTX 5000 (2080S) -> RTX A5000 (3080)
Quadro RTX 6000 (Titan RTX) -> RTX A6000 (3090)
Quadro RTX 8000 (Titan RTX) - this was just an RTX 6000 with double the VRAM.

So what gaps in the RTX A-series are being filled by Quadro RTX series cards?
 
It's advertised as having "TITAN class performance," not as a TITAN replacement. To me, the 3090 is effectively the successor to the TITAN RTX, but with NVIDIA's brand delineation, that is not what it is "officially."
^^ what he said ^^

The TITAN cards have traditionally had support for many (most?) Quadro driver-enabled features. The 3090 does not. So, while it offers the potential compute horsepower of a TITAN, it is not "the same as" a TITAN because those features have been driver-disabled.
 
*Obligatory I think mining is a giant fucking waste of resources and a scheme by China to exert (even more) control over the West*
With that out of the way...

the lite hash rate uses a secure handshake between the driver, the RTX GPU silicon, and the BIOS (firmware)
No, no, no. I may not agree with miners, but if they purchase a GPU and own the hardware, they should be able to do with it as they see fit. No firmware switches, no "security" bullshit, no DRM - the user owns the hardware and should be in FULL control.

There is no technical reason for this lockdown; it's completely arbitrary, and everyone should be very concerned about a future where new firmware updates and drivers irreversibly remove features - unless you pay the rental/subscription fee, of course. No doubt about it, that future is coming. And no, you can't roll back to older firmware: the new update changes the signing keys (for whom, exactly, is all this "security"? Certainly not the owner of the hardware). Any guesses as to when Nvidia will offer a "subscription service" to enable full hash rates again?
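For what it's worth, the mechanism being objected to is conceptually just challenge-response attestation: the driver enables the full hash rate only if the firmware proves it holds the expected secret. A purely illustrative sketch - every name and the toy MAC below are mine, not Nvidia's, whose actual protocol is not public:

```cpp
// Purely conceptual challenge-response sketch; NOT NVIDIA's actual
// protocol (which is not public). Every name here is hypothetical.
#include <cstdint>
#include <random>

// Stand-in for a real keyed MAC (e.g. an HMAC) known to driver and firmware.
uint64_t toy_mac(uint64_t key, uint64_t msg) {
    uint64_t x = key ^ msg;
    x ^= x >> 33; x *= 0xff51afd7ed558ccdULL; x ^= x >> 33;
    return x;
}

// Driver side: issue a random challenge and unlock the full hash rate only
// if the firmware's response proves knowledge of the shared secret.
bool firmware_attested(uint64_t shared_secret,
                       uint64_t (*firmware_respond)(uint64_t)) {
    std::mt19937_64 rng{std::random_device{}()};
    const uint64_t challenge = rng();
    return firmware_respond(challenge) == toy_mac(shared_secret, challenge);
}
```

Which is exactly the objection: a check like this adds nothing functional; it only gates what the owner may do with the hardware.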
 
Since I work for a federal agency, no, we don't and can't use our cards to make money. We use them for research.
You're wasting your breath. Their hatred for mining and crypto makes them blind to facts lol. "Graphics cards are made for games only! Graphics cards aren't intended for compute workloads! GeForce is for gamers only! They Terk R G.P.YOUUUUUUS!"

Compute has been an intended function of GPUs since ~2006 when CUDA was introduced by NVidia. It is well supported in the GeForce lineup.

I just hope all the whining and commotion that resulted in mining blocks doesn't lead to a slippery slope of à la carte features. I can see it now: $1,000 for your GPU, $200 a year to unlock DLSS, $200 a year for ray tracing, $1,000 a year for enhanced compute performance. That seems like the logical endgame for maximum profits, and we asked for it.
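On the compute point, the timeline checks out: CUDA arrived alongside the G80-based GeForce 8800 in late 2006, and the canonical first CUDA program still runs on any CUDA-capable GeForce. A minimal sketch (unified memory for brevity, which assumes a reasonably modern card):

```cuda
// The classic CUDA vector add: general-purpose compute on a GeForce card.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void vadd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float));  // unified memory for brevity
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    vadd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);               // expect 3.000000
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```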
 
You're wasting your breath. Their hatred for mining and crypto makes them blind to facts lol. "Graphics cards are made for games only! Graphics cards aren't intended for compute workloads! GeForce is for gamers only! They Terk R G.P.YOUUUUUUS!"

Compute has been an intended function of GPUs since ~2006 when CUDA was introduced by NVidia. It is well supported in the GeForce lineup.

I just hope all the whining and commotion that resulted in mining blocks doesn't lead to a slippery slope of à la carte features. I can see it now: $1,000 for your GPU, $200 a year to unlock DLSS, $200 a year for ray tracing, $1,000 a year for enhanced compute performance. That seems like the logical endgame for maximum profits, and we asked for it.

NOOOOO BUT NVIDIA SAID!!!!!!!!!!!!
 
You're wasting your breath. Their hatred for mining and crypto makes them blind to facts lol. "Graphics cards are made for games only! Graphics cards aren't intended for compute workloads! GeForce is for gamers only! They Terk R G.P.YOUUUUUUS!"

Compute has been an intended function of GPUs since ~2006 when CUDA was introduced by NVidia. It is well supported in the GeForce lineup.

I just hope all the whining and commotion that resulted in mining blocks doesn't lead to a slippery slope of à la carte features. I can see it now: $1,000 for your GPU, $200 a year to unlock DLSS, $200 a year for ray tracing, $1,000 a year for enhanced compute performance. That seems like the logical endgame for maximum profits, and we asked for it.
There has been product segmentation since NVidia started producing professional cards - like, say, double-precision compute.

This is just another layer of segmentation for a different professional use. There are just angry miners who want to pay gaming-card prices and get professional-level MH/s.
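The double-precision example is easy to verify at home: recent GeForce parts run FP64 at roughly 1/32 or 1/64 of FP32 rate, while the compute-oriented parts run it at 1/2. A rough micro-benchmark sketch of my own (coarse timing; assumes the dependent FMA chain isn't optimized away, which the stores should prevent):

```cuda
// Rough sketch: compare FP32 vs FP64 FMA throughput on your own card.
// GeForce parts typically show a large ratio; compute parts a small one.
#include <cuda_runtime.h>
#include <cstdio>

template <typename T>
__global__ void fma_loop(T *out, T seed, int iters) {
    T x = seed + T(threadIdx.x);
    for (int i = 0; i < iters; i++)
        x = x * T(1.000001) + T(0.000001);          // dependent FMA chain
    out[blockIdx.x * blockDim.x + threadIdx.x] = x; // keep the result live
}

template <typename T>
float time_ms(T *buf, int iters) {
    fma_loop<<<256, 256>>>(buf, T(1), 1);           // warm-up launch
    cudaDeviceSynchronize();
    cudaEvent_t t0, t1;
    cudaEventCreate(&t0); cudaEventCreate(&t1);
    cudaEventRecord(t0);
    fma_loop<<<256, 256>>>(buf, T(1), iters);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    float ms;
    cudaEventElapsedTime(&ms, t0, t1);
    cudaEventDestroy(t0); cudaEventDestroy(t1);
    return ms;
}

int main() {
    float  *f; cudaMalloc(&f, 256 * 256 * sizeof(float));
    double *d; cudaMalloc(&d, 256 * 256 * sizeof(double));
    float f_ms = time_ms(f, 1 << 16);
    float d_ms = time_ms(d, 1 << 16);
    printf("FP32 %.2f ms, FP64 %.2f ms, FP64/FP32 ratio %.1fx\n",
           f_ms, d_ms, d_ms / f_ms);
    cudaFree(f); cudaFree(d);
    return 0;
}
```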
 
There has been product segmentation since NVidia started producing professional cards - like, say, double-precision compute.

This is just another layer of segmentation for a different professional use. There are just angry miners who want to pay gaming-card prices and get professional-level MH/s.
Miners aren’t angry with the way things were.
Gamers are angry.

entitled something something…
 
There has been product segmentation since NVidia started producing professional cards - like, say, double-precision compute.

This is just another layer of segmentation for a different professional use. There are just angry miners who want to pay gaming-card prices and get professional-level MH/s.

I can't wait until they further artificially segment audio/video encoding/decoding out of the "gaming" lineup because it's a professional use, and see how many people change their tune. I don't understand why anyone is happy about Nvidia giving you less for your money.
 
I can't wait until they further artificially segment audio/video encoding/decoding out of the "gaming" lineup because it's a professional use, and see how many people change their tune. I don't understand why anyone is happy about Nvidia giving you less for your money.

I remember a time when there were numerous GPU manufacturers - ATI, Matrox, S3, Number Nine, 3D Labs, Rendition, NVidia, and 3dfx, to name a few. Now NVidia, with its 73% market share, is on the cusp of a monopoly. They've got us by the short and curlies and it shows.
 
I can't wait until they further artificially segment audio/video encoding/decoding out of the "gaming" lineup because it's a professional use, and see how many people change their tune. I don't understand why anyone is happy about Nvidia giving you less for your money.
If NVidia thinks they can charge more for a dedicated audio/video card, great, let them. Higher profits mean more dividends for me.
 
If NVidia thinks they can charge more for a dedicated audio/video card, great, let them. Higher profits mean more dividends for me.

That’s fine. But there are also people like me who vote with our wallets. I generally don’t buy Nvidia if I don’t have to because I don't like their practices. That also affects your dividends.
 
I can't wait until they further artificially segment audio/video encoding/decoding out of the "gaming" lineup because it's a professional use, and see how many people change their tune. I don't understand why anyone is happy about Nvidia giving you less for your money.

I get where you're coming from: I want to be able to do everything with my GPU, and I can, because I was lucky enough to obtain a 3090.

But this is not the same thing. The issue is that there is nothing to buy for your money in the first place. Nvidia is trying to fix that by removing a feature, which will cut off a huge chunk of the demand.

If supply could keep up with demand, I would be angry at Nvidia, but the reality is that miners are going to buy every single card, and supply will never catch up unless something is done.
 
I get where you're coming from: I want to be able to do everything with my GPU, and I can, because I was lucky enough to obtain a 3090.

But this is not the same thing. The issue is that there is nothing to buy for your money in the first place. Nvidia is trying to fix that by removing a feature, which will cut off a huge chunk of the demand.

If supply could keep up with demand, I would be angry at Nvidia, but the reality is that miners are going to buy every single card, and supply will never catch up unless something is done.

You're already seeing supply and demand at work. Between China's crackdowns, ETH's fee-structure changes, and the (eventual) move to proof of stake, this was completely unnecessary.
 
You're already seeing supply and demand at work. Between China's crackdowns, ETH's fee-structure changes, and the (eventual) move to proof of stake, this was completely unnecessary.

Yep, if we are to believe that NVidia is cranking out twice as many cards as they ever have in their history, then it stands to reason that eventually everyone who wants one will have one. Even though we aren't at the point where you can just buy whatever you want, whenever you want, for MSRP, I'm seeing signs of this phase coming to an end. Two days ago, I was seeing used 2080 Tis for around $1,200-1,300, which I thought was encouraging. Today, I found a listing for a 2080 Ti with an EK waterblock for $900 buy-it-now. A month ago, that would have been at least $2,000.
 
You're wasting your breath. Their hatred for mining and crypto makes them blind to facts lol. "Graphics cards are made for games only! Graphics cards aren't intended for compute workloads! GeForce is for gamers only! They Terk R G.P.YOUUUUUUS!"

Compute has been an intended function of GPUs since ~2006 when CUDA was introduced by NVidia. It is well supported in the GeForce lineup.

I just hope all the whining and commotion that resulted in mining blocks doesn't lead to a slippery slope of à la carte features. I can see it now: $1,000 for your GPU, $200 a year to unlock DLSS, $200 a year for ray tracing, $1,000 a year for enhanced compute performance. That seems like the logical endgame for maximum profits, and we asked for it.

If it wasn't for mining, most of you wouldn't even care what the compute performance was. Some of us just want a graphics card that doesn't cost over a grand.
 
That’s fine. But there are also people like me who vote with our wallets. I generally don’t buy Nvidia if I don’t have to because I don't like their practices. That also affects your dividends.
Yet there you are with a 3080 in your rig. You clearly didn't care that badly.
 
Yet there you are with a 3080 in your rig. You clearly didn't care that badly.

I bought it used, so Nvidia didn't get their cut :p. Actually, I need to update it, because I have a 6900 XT in there now.
 
While I would agree, the tensor cores and VRAM say otherwise. Tensor cores and AI performance are themselves data science technology. Sure, they are vital to running DLSS, but that is still a data-science-centered technology.

The RTX 3090 is marketed as the Titan RTX replacement, I believe, and the Titan RTX is a card marketed for data science. That is why I specifically mentioned the RTX 3090. The agency I work for gets discounts on the RTX 3090 directly from Nvidia for use in data science.

Yes, I know it is marketed toward gaming because that's the biggest audience. Behind the scenes, the 3090 is definitely developed with other uses in mind.
It doesn't have Titan in the name... so be careful:

I have seen a LOT of people justifying the price gap of the 3090 by buying into the "Titan replacement" or "Titan class" marketing speak of Nvidia.

This is not the case, and I want to explain why, and also give my opinion on how we got here.

First, let's start by clarifying what a Titan card is:

Titan cards are the biggest, baddest consumer die but without the consumer software limitations.

Some of the limitations include:
  • no access to NGX
  • artificial limitation to "non-pro" drivers with shit OpenGL performance
  • and, more importantly, artificial limitation of the tensor cores to half rate for FP32 accumulate (i.e., the actually useful part for deep learning)
  • not much VRAM
  • some other niche features like P2P DMA, etc.
Usually the Titan cards use the biggest X102 silicon, with a cut-down version of X102 reserved for the Ti card (cut down in silicon and gimped in drivers).

With the 3090, we indeed have a big GA102 die and lots of VRAM. So it's Titan class, right? Well...
  • The 3090 has no access to NGX
  • The 3090 has no access to certified pro drivers
  • The 3090 is gimped to half-rate tensor operations (confirmed here)
Performance-wise, this has HUGE implications: if you look at the tensor FLOPS of the 3090, you would think it is FOUR TIMES faster than the 2080 Ti. Wow, that's awesome! Except:
  • Nvidia quietly quotes tensor FLOPS with structured sparsity. This is quite sly, as those FLOPS are only attainable when your matrices are 2:4 sparse (i.e., at least 2 zeros per 4 elements), which is never the case for training. So from the get-go, the dense tensor FLOPS are half what Nvidia quotes.
  • The tensor cores are gimped in software to half rate for FP16 multiply + FP32 accumulate. So, yet again, half the performance.
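Chain those two halvings together and the headline figure deflates fast (illustrative arithmetic, with Nvidia's quoted number as X):

```
X      quoted "Tensor TFLOPS"  (assumes 2:4 structured sparsity)
X / 2  dense throughput        (training matrices aren't sparse)
X / 4  FP16 mul + FP32 acc     (half rate in software on the 3090)
```

So the honest dense-training number is a quarter of the spec-sheet figure.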

....I just hope all the whining and commotion that resulted in mining blocks doesn't lead to a slippery slope of à la carte features. I can see it now: $1,000 for your GPU, $200 a year to unlock DLSS, $200 a year for ray tracing, $1,000 a year for enhanced compute performance. That seems like the logical endgame for maximum profits, and we asked for it.
Your tinfoil hat is all shiny! I have some "stealth spray" for your hat; it's only $499 per can. Spray it on just like black spray paint. You can never be too careful...
 