NVIDIA's Secret GPU: TU106-400A vs. TU106-400 Benchmark

Megalith · 24-bit/48kHz · Staff member · Joined Aug 20, 2006 · Messages: 13,000
Gamers Nexus published a piece yesterday that is worth a watch for those contemplating an RTX 2070, as binning is in play: overclocked and other premium editions come with a better chip, the TU106-400A, while cheaper models come with a lesser variant, the TU106-400. Steve found distinct differences in silicon and cooler quality, but advises the premium may only be worth it for those who truly want lower noise or a higher overclock, which provides an extra 5-12% performance.

This review ultimately covers two key items: (1) the review of the EVGA RTX 2070 XC Ultra vs. the 2070 Black, establishing the differences between cards priced $70 apart (at time of posting); (2) the emergent differences between the TU106-400A and TU106-400 GPUs. The two go somewhat hand-in-hand. The clock difference between the two is significant, at more than 100MHz faster on the higher-end model. This isn’t just the PCB and VRM – it barely is, in fact – it’s the boosting behavior from the silicon quality and the cooler quality.
 
We're seeing more and more of this. Overclocking is largely dead. Look at the new i9-9900K. How much higher can you overclock it? 200MHz? And then the thing runs at 95C, hits a thermal wall, and fails.

If you want more performance in this day and age, you have to pay for it, unfortunately.

Also the reports of ~90C temps on the GDDR6 memory on these cards would give me pause when overclocking the VRAM.
 
Have Nvidia always done pre-sorted 'A' cards or was this intended for compliant partners? *cough*GPP*cough*
 
They woke up and are binning the hell out of everything when it comes to chips....

Chipmakers realized what is going on and invented the ultra high end for those "addicted" to MORE PERFORMANCE no matter the cost...
 
So this would explain the 2080ti lower priced part numbers starting to emerge...
 
I find it annoying when you as the consumer can't see these things on the front.

Selling different products under the same name with different specifications ought to be illegal.

Based on this knowledge, the first thing I'd do after receiving a GPU is run GPU-Z on it to find the exact chip, and return it if it isn't the better one.
 
No.

Maybe binned parts but this seems a bit different.

It seems to me a really shitty thing for Nvidia to do. The AIB partners aren't going to want to publicize which version they're using, because it'd affect sales of the 'cheaper' versions they might sell, or reveal that they only use the non-A versions. If they wanted to be really scummy, they could start with A versions, then once most reviewers have done their work, switch to the non-A chips.

Does anyone know if GPU-Z shows which version you have?
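For what it's worth, GPU-Z reads the PCI device and revision IDs, and on Linux you can read the same IDs straight from sysfs without any tool at all. Here's a minimal sketch; the ID-to-variant table is an illustrative assumption, so check a current PCI ID database before trusting it.

```python
# Sketch: guess the TU106 variant from the PCI device ID on Linux.
# The VARIANTS table is an assumption for illustration, not a
# verified mapping -- consult a current pci.ids database.
from pathlib import Path

# Hypothetical mapping of PCI device IDs to TU106 variants.
VARIANTS = {
    "0x1f02": "TU106-400 (non-A)",
    "0x1f07": "TU106-400A",
}

def variant_from_device_id(device_id: str) -> str:
    """Return a human-readable variant name for a PCI device ID."""
    return VARIANTS.get(device_id.lower(), f"unknown ({device_id})")

def scan_gpus(sys_pci: str = "/sys/bus/pci/devices") -> list[str]:
    """Walk sysfs and report the variant of every NVIDIA device found."""
    found = []
    for dev in Path(sys_pci).glob("*"):
        vendor = (dev / "vendor").read_text().strip()
        if vendor == "0x10de":  # NVIDIA's PCI vendor ID
            device = (dev / "device").read_text().strip()
            found.append(variant_from_device_id(device))
    return found
```

On Windows, GPU-Z shows the same device/revision IDs in its main window, which is presumably how it distinguishes the two chips.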
 
So this would explain the 2080ti lower priced part numbers starting to emerge...

You're paying for your card to be pre-binned. You want the good clocking card that runs cooler and has better boosts, you pay an extra $400 for it on top of the already ridiculous $1000 for the base model poorly binned chip.

I'm pretty sure the TU102-300"A" is already in existence in the released cards. Just a matter of time before they release the "non-A" cards for cheaper that don't boost as high.
 
Moore's law is dead.

This is the "Rape of the Bins" effect.

There used to be a bell curve of parts, and once a product was "in manufacturing", only the upper 20% or so was pulled for "high end cards".

Now, to hit the advertised speeds, they are Only using the upper ~20% of the curve, and the only difference is a few percent between the bleeding edge and the standard product.
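The "upper ~20% of the curve" idea can be sketched numerically: draw a bell curve of maximum stable clocks, set the advertised spec near the 80th percentile, and see how little headroom even the best die has left. The distribution parameters below (mean 1950MHz, sigma 40MHz) are invented for illustration, not measured.

```python
# Sketch: if the advertised boost clock sits at the 80th percentile of
# the silicon distribution, how much OC headroom does the best die have?
# All numbers here are made up for illustration.
import random

random.seed(42)
clocks = sorted(random.gauss(1950, 40) for _ in range(10_000))

spec = clocks[int(0.80 * len(clocks))]   # advertised clock ~ 80th percentile
best = clocks[-1]                        # luckiest die in the batch
headroom_pct = 100 * (best - spec) / spec

print(f"spec: {spec:.0f} MHz, best die: {best:.0f} MHz, "
      f"headroom: {headroom_pct:.1f}%")
```

With those made-up numbers, the luckiest die only sits a few percent above spec, which matches the "only diff is a few percent" complaint.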

So far, it looks like it's hitting both Intel and Nvidia hard, as they're "going soft" from where I'm standing. :LOL:

I haven't seen AMD pulling out the megacooler yet, so there is Some hope, but I don't really think it will be too long; certainly within a couple of years at the most.

How much frequency increase are we seeing from a waterblock upgrade? That used to be worth ~25% at least.

Truthfully, I wonder if Intel has been "turning off stuff" for lower end chips, or if it really WAS bad on the die.

That would explain a lot.

I've run a 1.3GHz overclock on my 3930K for years, 3.2GHz base vs 4.5GHz full-time OC. Adding a waterblock didn't give me any more speed, just a quieter rig.



('memberberries voice:) 'member when you could bios flash a video card, and Might be able to make it a better card?

How long has it been? Last one that worked for me was an X800...

EDIT: what causes an overstrike? I tried to put brackets around 'soft' above and it overstruck everything from there, lol.

OK, I understand. :)
 
Well stated: they were binning their chips. Lower quality components are being used by video card companies. This is not their fault but Nvidia manipulating the market again. They WILL make their money because they Control the market.
 
I find it annoying when you as the consumer can't see these things on the front.

Selling different products under the same name with different specifications ought to be illegal.

Based on this knowledge, the first thing I'd do after receiving a GPU is run GPU-Z on it to find the exact chip, and return it if it isn't the better one.

This makes, literally, no sense and has no precedent. If it meets its advertised specs the fact that some could do more and some couldn't is irrelevant. We, as enthusiasts, have lived on this for decades. We find the extra potential, we decipher the variances, we polish the diamonds in the rough, we reject the polished turds. Where is the enthusiast if we know it will do x speed and over clock to x speed guaranteed?
 
I would think this is pretty easy for them to do. AFAIK the chips closest to the center of the wafer clock the highest on the lowest voltage.
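That center-of-wafer intuition is easy to sketch: model dice on a grid, make quality a noisy function of distance from the wafer center, and check that the top bin clusters near the middle. The quality formula and every number below are entirely synthetic, not real process data.

```python
# Sketch: dice near the wafer center tend to bin higher. "Quality" is an
# invented score: high at the center, falling off with radius, plus noise.
import math
import random

random.seed(0)
WAFER_R = 150.0  # mm, a 300 mm wafer
DIE = 10.0       # mm die pitch (made up)

dies = []
steps = int(WAFER_R // DIE)
for i in range(-steps, steps + 1):
    for j in range(-steps, steps + 1):
        dist = math.hypot(i * DIE, j * DIE)
        if dist <= WAFER_R:
            quality = 100 - 0.2 * dist + random.gauss(0, 3)
            dies.append((dist, quality))

# "A" bin = top 20% by quality, everything else is the regular bin
dies.sort(key=lambda d: d[1], reverse=True)
cut = len(dies) // 5
a_bin, rest = dies[:cut], dies[cut:]

avg_a = sum(d[0] for d in a_bin) / len(a_bin)
avg_rest = sum(d[0] for d in rest) / len(rest)
print(f"A-bin mean radius: {avg_a:.0f} mm vs rest: {avg_rest:.0f} mm")
```

Under this toy model, the A-bin dice sit measurably closer to the center than the rest, which is the effect being described.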
 
This makes, literally, no sense and has no precedent. If it meets its advertised specs the fact that some could do more and some couldn't is irrelevant. We, as enthusiasts, have lived on this for decades. We find the extra potential, we decipher the variances, we polish the diamonds in the rough, we reject the polished turds. Where is the enthusiast if we know it will do x speed and over clock to x speed guaranteed?

I have no problem with them binning their parts.

If they do and they create different part numbers out of it, I think that should be printed on the box and in the specs so I know what I'm buying.

That's all I'm asking. To know what part number I am buying.
 
I have no problem with them binning their parts.

If they do and they create different part numbers out of it, I think that should be printed on the box and in the specs so I know what I'm buying.

That's all I'm asking. To know what part number I am buying.

You do though. You're getting a product that meets the 2070 spec. So, what is your complaint?
 
If anyone finds a 2080ti $999 model, they should do some GPU-Z checks to see if a different chip shows up in it or something.
 
You are buying a 2070 spec graphics card. It makes spec. You got what you bought.


I'm ok with varying silicon quality if it has not been intentionally manipulated.

I know it's the silicon lottery.

If someone is manipulating it, however, and putting it into different categories, I want to know what I am buying.

Essentially, everything they know should be shared with the customer. If they don't know, they don't have to tell me, but if they do, I should be informed.
 
I'm ok with varying silicon quality if it has not been intentionally manipulated.

I know it's the silicon lottery.

If someone is manipulating it, however, and putting it into different categories, I want to know what I am buying.

Essentially, everything they know should be shared with the customer. If they don't know, they don't have to tell me, but if they do, I should be informed.

You are buying a 2070 spec graphics card. It makes spec. You got what you bought. There are no different categories. It's a good thing you weren't around in the ATI Radeon 9500 Days for instance....because your head would have exploded.
 
Essentially, everything they know should be shared with the customer. If they don't know, they don't have to tell me, but if they do, I should be informed.

Problem is, anything they shared with the customer would be interpreted as a hard guarantee -- there's no room for grey area. So if they stated "testing found this GPU runs beyond spec 99% of the time, but may crash in certain scenarios," the customer will bitch and moan that the GPU is a "lemon" and demand an RMA for the 1% of the time it might fail.

If there's an upside for Nvidia or the AIB partner to disclose stress testing data or however they're binning chips, I don't see it.
 
Reminds me of Volkswagen, somehow....

Lies, lies and more lies, trying to find a justification for fooling you.

Nvidia, do you really need that attention ?
 
I think I got lucky with buying the MSI Armor RTX 2070 non-OC. It runs at a maximum temperature of 67C and I can barely hear the fans, and that's overclocked with NVIDIA Scanner. It boosts to about 1950MHz, although HWiNFO reads the max as 2010MHz. That's almost as much as review samples of the 2070 Gaming Z. Just a heads-up that you might still get lucky while paying a lot less than for the Gaming Z.
 
But, there are no lies.

The higher binned silicon makes it into "OC" branded cards that are...yep...overclocked out of the box, while the regular binned stuff goes into spec cards. Guys, there is no conspiracy, no lies nor any tomfoolery and no, I'm no fan of Nvidia but this is just witch hunting. If we keep crying foul then we'll have no voice when it's true.

So, the only fair thing to say is that the days of overclocking are over and I'm not sure that it's something anyone owed us in the first place.
 
But, there are no lies.

The higher binned silicon makes it into "OC" branded cards that are...yep...overclocked out of the box, while the regular binned stuff goes into spec cards. Guys, there is no conspiracy, no lies nor any tomfoolery and no, I'm no fan of Nvidia but this is just witch hunting. If we keep crying foul then we'll have no voice when it's true.

So, the only fair thing to say is that the days of overclocking are over and I'm not sure that it's something anyone owed us in the first place.

Isn't this kinda like the days when you had GTX cards, then a tiny notch above that you had Ultra ones? Anyway, storm in a teacup.
 
When Nvidia bins it themselves and gives it a different part number, I'd argue I should know which GPU part number I am buying.
Do you know which memory chips are on it? What about the VRM components? How about whether it was assembled in China or Taiwan? These variables and many more affect performance, and manufacturers know exactly how each permutation affects performance because they have to test every component they source so they can be sure they're getting what they need at the best cost. If they're using expensive Japanese capacitors over cheaper Chinese ones, it's because they have to. If they find a cheaper part that meets all their needs, they're going to switch it out and not say a damned thing about it to the customer unless and until it bites them in the ass.
 
I have no problem with them binning their parts.

If they do and they create different part numbers out of it, I think that should be printed on the box and in the specs so I know what I'm buying.

That's all I'm asking. To know what part number I am buying.

It’s pretty clear which is which, in my view. The lower-end chips are being used for cards that are clearly designed to target the discounted market. Enthusiasts aren’t going after MSI Armor, for example. It’s not a secret which cards are using these, and the pricing delta is a giveaway. Here in Canada, I’m seeing pricing differences upwards of $200 CAD for either version retail right now. At that point, it’s no secret there’s a difference. Any semi-educated tech consumer would take it upon themselves to check to see what’s justifying the price delta, and the ones who don’t probably aren’t the enthusiast demographic anyway, and therefore won’t care because they’re not overclocking anyway.

From what I understand, Nvidia insisted on a version from board makers guaranteed to be priced at the launch MSRP, and this was the solution. I don’t see anything nefarious about that. With pricing differences being what they are right now, it’s obvious which is which. I’ll be upset if they decide to sell the inferior chips under the flagship brands (Gaming Z, Aorus, ROG, etc), which are higher priced and targeted at the enthusiast market. Overclocking is never guaranteed, and the lower priced chips are running at Nvidia’s specified clock speeds, so you’re getting exactly what you paid for.
 
You are buying a 2070 spec graphics card. It makes spec. You got what you bought. There are no different categories. It's a good thing you weren't around in the ATI Radeon 9500 Days for instance....because your head would have exploded.

My soft-modded Radeon 9500 to Radeon 9700 is still my favourite video card ever. Talk about unlocking value!
 
It feels good to have a 1080 ti and a 1440p monitor. I have no reason to participate in the most bizarre, inexplicably deceptive launch I've ever seen. Between the lack of performance data at product announcement, overreaching NDA's, launch delay, reportedly rapidly deteriorating cards and unjustifiably high prices, I couldn't be happier to sit this one out.

They should call the non-A cards RTX 2070B. As in you'd B an idiot to buy this over a 1080ti.
 
You are buying a 2070 spec graphics card. It makes spec. You got what you bought. There are no different categories. It's a good thing you weren't around in the ATI Radeon 9500 Days for instance....because your head would have exploded.
Repeating the same thing over and over again doesn't help :p

You go buy a car. It's advertised as 300 hp, 250 ft-lb torque, 0-60 in 4.5 seconds, top speed of 185 mph, and a curb weight of 3100 lbs. You _really_ don't care if it's a V8 or blown I4 under the hood?

(FWIW, I'm behind your thinking -- it doesn't matter what chip is present as long as it meets 2070/2080/ti spec. But there is something to be said on expectation of product received for money paid)
 
Just a way for companies to not lose money on produced cards.

Card makes the high spec goes to premium pricing

Card doesn’t make the best spec may go to mid tier.

Card meets the barest spec goes to the low tier bin.

Saves the company from having to eat the cost. Granted it isn’t always clear to us but we all can decide which price point and performance tier we want.

Deceptive? Not really.

Many manufacturers do the same thing. It's up to us to figure it out. You want a discount? Shop at Costco or Walmart. You want all the bells and whistles? You may have to go to a specialty shop or order online.

Not seeing too much of an issue here. As much as I would like to bust Nvidia's chops on it, what it tells us is pretty much what SickBeast stated in post 3. Seems they have hit the proverbial wall. Or a more literal one.
 
Seems they have hit the proverbial wall. Or a more literal one.

I don't even really see this; they've been on 16/12nm for quite some time and are moving to "7nm". That'll last a while, for AMD too. Hell Intel might use it now that they're farming out to commercial fabs.

This 'A' part just seems like a more direct method of binning and it's more surprising that they haven't done it before than that they're doing it now.
 
I don't even really see this; they've been on 16/12nm for quite some time and are moving to "7nm". That'll last a while, for AMD too. Hell Intel might use it now that they're farming out to commercial fabs.

This 'A' part just seems like a more direct method of binning and it's more surprising that they haven't done it before than that they're doing it now.

Quite possible.
 
Marketing is a bitch.
One of my biggest complaints of ALL is the laptop space.
It's more misleading than any rebrand, because these are sold as single products and not an entire market.

Oh, you have a 12-inch laptop? That means your i7 is a downclocked desktop i3. You have a 13-inch? Then it's like a downclocked i5. You have a 17-inch? Then it's a downclocked i7.
A laptop GTX-whatever is always way less, and people think a 12-inch with a GTX 1050 and an i7 is equal to a desktop with an i7 and a 1050. It's absolutely miserable having to explain this to people who need workstations and don't understand computers.

This would obviously be the same for AMD, but they're not that large in the notebook segment yet.
 