RTX 3060 12GB

I'm not sure I follow (it's such a strange semantic to wrap my mind around), but the 3080 "lite" is being sold as a 3080, and a less-lite version will be sold as a 3080 Ti.
Hopefully I can be clearer now:

1. 3080 vs 3090. Both use GA102 as their chip. The 3080 is a cut-down version of that chip. Different shader core count and memory, different GPU name as branding.
2. 3060 12GB and 3060 6GB. Both use GA106 as their chip. The 6GB is a cut-down version of that chip. Different shader core count and memory, but the same GPU name as branding.

Almost all GPUs sold are chips that have been cut down to improve yields, no?
Yes, and they are branded as different GPUs: regular or Ti versions. Meanwhile, the 3060 12GB and 6GB will both be branded as if they were the same GPU. They are not.

I think if they called that 3060 12GB a 3060 Super and the other one just a 3060, everything would make sense. The issue is the exact same name, not that both are put on the same xx60 tier, imo.
The Super designation has been used only once in the past, and that was for the refreshed lineup a year later. Super is not used to differentiate tiers within the same year's lineup, which follows what Nvidia has done so far. Likely, in 2021 we'll have Super variants of the 30 series.

If you don't see the differences, I don't know how to put it any clearer.
 
1. 3080 vs 3090. Both use GA102 as their chip. The 3080 is a cut-down version of that chip. Different shader core count and memory, different GPU name as branding.
The 3080 vs the 3090 also has a different memory bus; that's the important difference here, I think (and that's why the 3080 Ti will be significantly different from a 3090 in some scenarios).

Redo that same exercise with the 3080 Ti and the 3080:
Both use the GA102; the 3080 is a more cut-down version of that same chip than the 3080 Ti will be. Different GPU name as branding, but the exact same xx80 family branding.

2. 3060 12GB and 3060 6GB. Both use GA106 as their chip. The 6GB is a cut-down version of that chip. Different shader core count and memory, but the same GPU name as branding.
I see the difference; that's why I said a 3060 Super vs. a regular 3060 would have been the best way to go, imo. It is not the sharing of the xx60 family tier that is an issue, as they are both cut-down versions of the GA106 chip, like the 3080 is to the 3080 Ti, and I imagine the 3070 will be to the 3070 Ti (and the many Super or KO editions that will come out will probably have some of the same going on).
 
I see the difference; that's why I said a 3060 Super vs. a regular 3060 would have been the best way to go, imo. It is not the sharing of the xx60 family tier that is an issue, as they are both cut-down versions of the GA106 chip, like the 3080 is to the 3080 Ti, and I imagine the 3070 will be to the 3070 Ti (and the many Super or KO editions that will come out will probably have some of the same going on).
Right, but that would break their naming convention.

You'd have

3090
3080 Ti
3080
3070 Ti
3070
3060 Ti
3060 Super
3060
3050 Ti
3050

That makes no sense. The fact that we'd have to contort to these extremes to explain what is happening is precisely what shows you that the base 3060 is not real; it's a 3050 Ti that they're charging you more for by calling it something else. And I'm not the only one saying that; check the sources. Videocardz already thinks that too, and over the past few years it's become more and more clear that the 1060 3GB wasn't really a 1060, but a rebranded 1050 Ti. There's a notable history to take into account here.
 
Bring back the "Sucky Edition"!!
Oh wow, I totally forgot about that :D Sucky edition indeed! And actually, that helps explain my point:

[attached image: GTX 460 variant spec comparison table]
Note how the SE version has fewer shaders. The other two 460 variants both have 336 cores, a completely unchanged GPU, and the same memory speed. What changes is the memory bus, so LukeTbk does have a point here. Still: same shader core count, very similar performance, with the exception of the added bandwidth. That won't be the case between two 3060 variants with different shader cores, just like it wasn't between the 6 and 3GB 1060 variants. If anything, this goes to show that Nvidia changes the branding name when the card has a different shader core count, because that's the main performance driver. The fact that they're obscuring this in the 3060 as they did in the 1060 is misinforming AF. I fell for it with the 1060 3GB; think how many people will make the same misinformed error this time around.
 
Yeah, the 460 SE was probably the worst example of consumer confusion. Uninformed buyers expected the 1060 3GB to run slower than the 1060 6GB due to the lower VRAM (even though the actual reason was the lower core count).

Back then though, you had a 460 SE 1 GB and a 460 768 MB for around the same price. "Surely the 1 GB Special Edition is faster!" Nope.

Had a 460 768 MB for F@H back in the day. Good times.
 
Yeah, the 460 SE was probably the worst example of consumer confusion. Uninformed buyers expected the 1060 3GB to run slower than the 1060 6GB due to the lower VRAM (even though the actual reason was the lower core count).
I'm slightly embarrassed to admit I was one of those uninformed buyers. Although technically I wasn't - you tell me it's a 1060 with less VRAM, I don't have to wait for reviews, I "know" what to expect and just use lower textures. Boy was that the wrong assumption. I've wised up ever since! That's why I created this thread: to a) shame Nvidia a bit for this shady trend and b) make it more likely other buyers will realize what they're actually considering buying.
 
Some good news to end the year on: seems like Nvidia has backtracked on the stupid decision to repeat the mistake of the 1060 3GB:

[attached image: leaked RTX 3060 lineup table showing a single 12GB model]
So that means the non-Ti 60 series will have 12GB VRAM, period; no shadily marketed weaker variants of the same "family" with no naming difference. The 3060 6GB is now revealed, as I said weeks ago, as what it is: a 3050 Ti, no more, no less. I wonder why the change in strategy at the very end, just a couple of weeks before the expected announcement. These things - late changes - happen all the time in the industry, but you have to wonder why. Perhaps AMD's equivalent is good enough that they can't market the 6GB as a 60 series? Personally, I think this is how Nvidia will justify a $299 price for the 3060, while halving the VRAM plus the weaker GPU brings the 3050 Ti down to $199-230, with $149-175 for the 3050.

Either way, this is good news for consumers. It was clear from the get-go that the 3060 6GB was deceptive marketing, not following the structure that virtually the rest of the lineup follows. As for me, the 3060 is back in consideration vs. AMD's alternative. It'll be hard to choose AMD though: both cards will have the same VRAM, but Nvidia can do AI upscaling while AMD still can't. Buy what the product can do now, not what it's promised to do in the future.
 
I wouldn't be surprised if they jack up the prices too.
Nvidia is just one of those companies that GOT LUCKY with their engineers.

They got popular and powerful, but ...do they deserve it at this point in time?

When it comes down to it, ethics and honor MATTERS.
It certainly hinders profit in the short term, but long-term success does in fact hinge on honor, integrity, and true intelligence.

Not cheap marketing cleverness, supply-and-demand manipulation, and hype.
Those provide a quick buck, but in the end, they will never deliver those long-term results that truly powerful innovators demand.
Doesn't seem to matter to consumers. Even people around here say they don't like the company's practices but buy the products anyway.
 
Some good news to end the year on: seems like Nvidia has backtracked on the stupid decision to repeat the mistake of the 1060 3GB:

[attached image: leaked RTX 3060 lineup table showing a single 12GB model]

So that means the non-Ti 60 series will have 12GB VRAM, period; no shadily marketed weaker variants of the same "family" with no naming difference. The 3060 6GB is now revealed, as I said weeks ago, as what it is: a 3050 Ti, no more, no less. I wonder why the change in strategy at the very end, just a couple of weeks before the expected announcement. These things - late changes - happen all the time in the industry, but you have to wonder why. Perhaps AMD's equivalent is good enough that they can't market the 6GB as a 60 series? Personally, I think this is how Nvidia will justify a $299 price for the 3060, while halving the VRAM plus the weaker GPU brings the 3050 Ti down to $199-230, with $149-175 for the 3050.

Either way, this is good news for consumers. It was clear from the get-go that the 3060 6GB was deceptive marketing, not following the structure that virtually the rest of the lineup follows. As for me, the 3060 is back in consideration vs. AMD's alternative. It'll be hard to choose AMD though: both cards will have the same VRAM, but Nvidia can do AI upscaling while AMD still can't. Buy what the product can do now, not what it's promised to do in the future.

Looks like a solid lineup. The 3050 Ti would be great at the $200 price point.

I would be surprised if there wasn't an 8 GB 3050, especially considering it is RTX capable and there is a 12 GB 3060. That could be a solid alternative for those eyeing a Series S for light gaming.

A low-profile version of that would be a godsend for those of us stuck with a 1050 Ti / 1650.

All said, glad they are being more consumer friendly.
 
This is quite the strange lineup, when we start considering that a 3060 being called a 3050 Ti instead counts as less deceptive marketing.

That table seems to have a typo in it (the memory clock line makes little sense; not only does it not show a frequency, the bandwidth numbers also seem counterintuitive).
 
This is quite the strange lineup, when we start considering that a 3060 being called a 3050 Ti instead
It was always a 3050 Ti: the cut-down GA106 chip. I agree that the memory clock seems weird; GDDR6 at 14 Gbps is more logical. The bus width was always expected to be 192-bit though, I don't understand your confusion there. Also remember, these are unconfirmed specs. It's a rumor, not an official spec.
Looks like a solid lineup.
Agreed. Now there's a regular and a Ti version for each tier, and each GA chip falls where it's supposed to: regular chip as the regular model, cut-down chip as the lower tier's Ti brand. Prices should fall where they're expected, too.

Only problem, if you ask me, is the 3070 and 3080: the VRAM situation really makes it seem like Nvidia underestimated AMD. They should have gone with higher VRAM from the get-go. Considering the need for VRAM in future games, this could be a marketed feature by spacing out the models (quick sketch of the bus-width constraint after the list):

*6GB 3050 (seems like they can't realistically go under 6GB for a ray-tracing-enabled card)
*8GB 3050 Ti
*12GB 3060 (jump to 12GB because 10GB doesn't make sense with a 192-bit bus)
14GB 3060 Ti
16GB 3070
*18GB 3070 Ti
20GB 3080
*22GB 3080 Ti
24GB 3090
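
To make that bus-width constraint concrete, here's a minimal sketch (my own illustration, assuming one GDDR6 chip per 32-bit channel and the common 8 Gb / 16 Gb, i.e. 1GB / 2GB, densities - nothing official):

```python
# Each GDDR6 chip hangs off its own 32-bit channel, so with uniform chips
# the capacity comes in multiples of (bus width / 32) x chip density.
GDDR6_DENSITIES_GB = [1, 2]  # 8 Gb and 16 Gb dies

def capacity_options(bus_width_bits: int) -> list[int]:
    channels = bus_width_bits // 32  # one chip per 32-bit channel
    return [channels * d for d in GDDR6_DENSITIES_GB]

for bus in (128, 192, 256, 320, 384):
    print(f"{bus}-bit -> {capacity_options(bus)} GB")
# 128-bit -> [4, 8] GB
# 192-bit -> [6, 12] GB   (why a 192-bit 3060 is 6 or 12 GB, never 10)
# 256-bit -> [8, 16] GB
# 320-bit -> [10, 20] GB
# 384-bit -> [12, 24] GB
```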
 
It was always a 3050 Ti: the cut-down GA106 chip. I agree that the memory clock seems weird; GDDR6 at 14 Gbps is more logical. The bus width was always expected to be 192-bit though, I don't understand your confusion there. Also remember, these are unconfirmed specs. It's a rumor, not an official spec.
My confusion is

3060 Ti: 14 Gbps with a 256-bit bus
3060: 16 Gbps with a 192-bit bus.

I do not understand a 3060 Ti with less and slower RAM at the same time (while having a more expensive data bus); it seems like the 14 and 16 should be inverted.
 
My confusion is

3060 Ti: 14 Gbps with a 256-bit bus
3060: 16 Gbps with a 192-bit bus.

I do not understand a 3060 Ti with less and slower RAM at the same time (while having a more expensive data bus); it seems like the 14 and 16 should be inverted.
You are correct, it's probably a typo.
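
For anyone skimming, the back-of-napkin math shows why those rumored numbers read as swapped. This is just the standard bandwidth formula applied to the rumored, unconfirmed specs:

```python
# Effective memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(14, 256))  # 3060 Ti:          448.0 GB/s
print(bandwidth_gb_s(16, 192))  # rumored 3060:     384.0 GB/s
print(bandwidth_gb_s(14, 192))  # 3060 at 14 Gbps:  336.0 GB/s
```

Even with the faster chips, the narrower bus keeps the 3060 below the Ti; the odd part is paying for 16 Gbps chips on the cheaper card, which is why a typo is the most plausible explanation.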
 
I would be interested in a 3050 for a Plex compute HTPC. Wonder how long it'll take for those to be available.
 
This is unacceptable, it's false advertising
Incorrect. Substrings of the model number are irrelevant. The specifications, including the shader count, for the full part number are clearly provided. That's like saying the iPhone 12 Pro is false advertising because it has a smaller third camera sensor than the iPhone 12 Pro Max, because it's "understood" that the Pro line has an extra camera over the non-Pro line.
 
It was always a 3050 Ti: the cut-down GA106 chip. I agree that the memory clock seems weird; GDDR6 at 14 Gbps is more logical. The bus width was always expected to be 192-bit though, I don't understand your confusion there. Also remember, these are unconfirmed specs. It's a rumor, not an official spec.

Agreed. Now there's a regular and a Ti version for each tier, and each GA chip falls where it's supposed to: regular chip as the regular model, cut-down chip as the lower tier's Ti brand. Prices should fall where they're expected, too.

Only problem, if you ask me, is the 3070 and 3080: the VRAM situation really makes it seem like Nvidia underestimated AMD. They should have gone with higher VRAM from the get-go. Considering the need for VRAM in future games, this could be a marketed feature by spacing out the models:

*6GB 3050 (seems like they can't realistically go under 6GB for a ray-tracing-enabled card)
*8GB 3050 Ti
*12GB 3060 (jump to 12GB because 10GB doesn't make sense with a 192-bit bus)
14GB 3060 Ti
16GB 3070
*18GB 3070 Ti
20GB 3080
*22GB 3080 Ti
24GB 3090

As you pointed out, there are only so many options they can do. For a 192-bit bus, it's either 12 GB (which some will say is too much and a waste) or 6 GB (which others will say is too low).

My guess is the 6 GB 3050 Ti will be the value champion and perfect for 1080p high at higher fps.

Same goes for the 128-bit 3050: 4 GB or 8 GB.
 
Incorrect. Substrings of the model number are irrelevant. The specifications, including the shader count, for the full part number are clearly provided. That's like saying the iPhone 12 Pro is false advertising because it has a smaller third camera sensor than the iPhone 12 Pro Max, because it's "understood" that the Pro line has an extra camera over the non-Pro line.

Not following the analogy. It would be like an iPhone 12 Pro 128 GB coming with a worse camera than an iPhone 12 Pro 256 GB, or whatever.

Clearly provided? If it doesn't say it in the title, i.e. "GTX 1060 1152-core", then it is not very consumer friendly.

Nvidia thought as much too and is fixing their mistake.
 
Incorrect. Substrings of the model number are irrelevant. The specifications, including the shader count, for the full part number are clearly provided. That's like saying the iPhone 12 Pro is false advertising because it has a smaller third camera sensor than the iPhone 12 Pro Max, because it's "understood" that the Pro line has an extra camera over the non-Pro line.
If you read the thread before opening your mouth, you'd realize that the iPhone 12 Pro and iPhone 12 Pro Max are two differently marketed brands. That is not the case with two cards called 3060 but with different VRAM amounts, whose naming hides the fact that they have very different CUDA core counts, and therefore performance.

But go ahead, try to misrepresent what this thread is about and what other users here, and other websites, have acknowledged. I plan on ignoring anything that is not a cogent argument. HNY!
 
In the interest of keeping this thread updated for potential readers: new information is coming out suggesting that what would have been the 3060 6GB, the 106-300 chip, may have become the standard 3060 - whether as a 6 or 12GB variant, we don't know. If Nvidia releases two 106-300 models with different VRAM, at least it'll be a clearly branded purchasing decision, unlike the 1060 fiasco.

I doubt the 106-400 becomes a variant like that rumored 3060 Ultra, which seems clearly fake (check its supposed picture, and then today's leaked 3060 106-300: they're the same picture, so it's obvious the "Ultra" was a cheap photoshop, as suspected). We should find out about the 3060 in 3 days; expected specs:

[attached image: expected RTX 3060 spec table]
 
I have been waiting for a "low cost" 30-series card, and now I'm so bummed out.
 
In the interest of keeping this thread updated for potential readers: new information is coming out suggesting that what would have been the 3060 6GB, the 106-300 chip, may have become the standard 3060 - whether as a 6 or 12GB variant, we don't know. If Nvidia releases two 106-300 models with different VRAM, at least it'll be a clearly branded purchasing decision, unlike the 1060 fiasco.

I doubt the 106-400 becomes a variant like that rumored 3060 Ultra, which seems clearly fake (check its supposed picture, and then today's leaked 3060 106-300: they're the same picture, so it's obvious the "Ultra" was a cheap photoshop, as suspected). We should find out about the 3060 in 3 days; expected specs:

[attached image: expected RTX 3060 spec table]

Yeah, that made zero sense for the Ultra. Sure, it had more VRAM, but why would they position it higher than the 3060 Ti when it had less bandwidth and fewer shaders??
 
I have been waiting for a "low cost" 30-series card, and now I'm so bummed out.
I doubt you'll find a 3060 for less than $299. I think the 3050 Ti will become $230 and the 3050 $175, or something like that. Basically every tier has gone up in price. Can't wait for Intel to get into the GPU game and start breaking everything (if they can build a capable GPU, that is...)
 
I doubt you'll find a 3060 for less than $299. I think the 3050 Ti will become $230 and the 3050 $175, or something like that. Basically every tier has gone up in price. Can't wait for Intel to get into the GPU game and start breaking everything (if they can build a capable GPU, that is...)
$299 for a 3060 might be just right for me. Or maybe that $175 3050. I'm not a gamer, just a Lightroom and Photoshop user, so I need a decent GPU. But money not spent on that new vid card is money that's available for an AIO cooler to replace the stock AMD cooler. Or a hundred other items.
 
There we go:

[attached image: RTX 3060 announcement slide]
3060 12GB for $329. That is... acceptable, I guess. Specs from Videocardz:

[attached image: RTX 3060 spec table via Videocardz]
I for one am super happy not to see Jensen's face this time around. Sick of seeing his smugness on stage. This Jeff Fisher dude presenting is a lot more palatable.
 
Seems like the "3060 6 GB" will just be the 3050 Ti.
I wouldn't be so sure yet. A $329 price leaves the door open for a $279 6GB version. At least, since Nvidia went with the 106-300 chip, both would have the exact same shader count, making memory the only differentiator (unconfirmed, but likely). That wouldn't be a bad value proposition per se for a lot of buyers, leaving space for a $229 3050 Ti and a sub-$200 3050.

Looking forward to the reviews of the 12GB model.
 
In before the "why is this 12 GB and the 3060 Ti only 8 GB" YouTuber-esque comment.

For real though, it looks to be a good price, and I hope they come out with a 6 GB version for around $50 less.
 
In before the "why is this 12 GB and the 3060 Ti only 8 GB" YouTuber-esque comment.
For when that inevitable comment shows up again, here's an excellent explainer from Anandtech in 2011 of why bus width sort of dictates capacity (it doesn't really, but for good performance/efficiency it basically does):

https://www.anandtech.com/show/4221/nvidias-gtx-550-ti-coming-up-short-at-150/2

I've updated the thread title too, since the 6/12GB debacle seems to have been solved in the end. There's been plenty of discussion about this card in this thread anyway, re: 6GB vs. 12GB models and shader count, so I think it's worth keeping it going.
 
This card apparently has a 170W TDP. Even these "dinky" coolers will probably be more than sufficient.

The page you linked also mentions that the 3060 supports Resizable BAR with some of EVGA’s Z490 and Z590 motherboards. Looks like the support is getting close.
The 2060 was 160W and they made full-sized models. Plus way more to choose from.

https://www.evga.com/articles/01294/evga-geforce-rtx-2060/

The rest of the 3060 lineup by comparison lol

[attached image: the announced 3060 partner card lineup]
 
For when that inevitable comment shows up again, here's an excellent explainer from Anandtech in 2011 of why bus width sort of dictates capacity (it doesn't really, but for good performance/efficiency it basically does):

https://www.anandtech.com/show/4221/nvidias-gtx-550-ti-coming-up-short-at-150/2

I've updated the thread title too, since the 6/12GB debacle seems to have been solved in the end. There's been plenty of discussion about this card in this thread anyway, re: 6GB vs. 12GB models and shader count, so I think it's worth keeping it going.

Weird hybrid design for the 192-bit 550 Ti in that article, getting a 1 GB pool instead of 768 MB by using a few 2 Gb chips instead of 1 Gb ones.

So why not use a few 16 Gb chips alongside mostly cheaper 8 Gb chips to get 8 GB on the 3060 the same way?

Basically it would be the same setup as the 550 Ti, but with the chip sizes multiplied by 8.
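
For what it's worth, here's a rough sketch of what that scaled-up layout would imply, assuming the same striping behavior the AnandTech article describes (the helper and configs are purely illustrative, not a real SKU):

```python
# With mixed chip densities, only the region that stripes evenly across
# all chips sees the full bus width; the leftover capacity on the denser
# chips sits behind a narrower slice of the bus.
def mixed_pool(chips_gb: list[float]) -> tuple[float, float]:
    """Return (total capacity, capacity reachable at full bus width) in GB."""
    full_width = min(chips_gb) * len(chips_gb)  # evenly striped region
    return sum(chips_gb), full_width

# GTX 550 Ti: 192-bit, 4 x 1 Gb + 2 x 2 Gb chips
print(mixed_pool([0.128] * 4 + [0.256] * 2))  # (1.024, 0.768): 1 GB, 768 MB striped

# Same layout x8: 4 x 8 Gb + 2 x 16 Gb on 192-bit
print(mixed_pool([1.0] * 4 + [2.0] * 2))      # (8.0, 6.0): 8 GB, 6 GB striped
```

The catch is the same as in 2011: only the first 6 GB would get the full 192-bit width, with the last 2 GB hanging off a 64-bit slice.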
 
Basically it would be the same setup as the 550 Ti, but with the chip sizes multiplied by 8.
Speaking from ignorance, but if I had to guess, the things they could get away with back then no longer apply. The problem with mixed memory chips, as I've understood it, is that it's too difficult to sync memory traffic properly across different sizes, since the data is being managed in a non-uniform way. I'd figure this kind of messing around with the memory controllers would hurt ray-tracing performance: RT still requires more performance, cores, and memory than we're currently getting, so messing with memory access patterns is probably harmful. Total guess, but considering the data requirements of RT, it makes sense.
 
Makes sense. I'm just curious how MS got away with it on the new consoles. Perhaps discrete GPUs require more precise synchronization.
 
Makes sense. I'm just curious how MS got away with it on the new consoles. Perhaps discrete GPUs require more precise synchronization.
Guessing again, but from what I understand the XSX has 16GB, of which 13.5 are available to game devs (2.5GB for the OS). Those 13.5 won't all be for graphics; they'll be distributed among all the game's memory needs. Do we know if the XSX is 192- or 256-bit memory? All I know from Digital Foundry is that it works:

[attached image: Digital Foundry's Series X memory breakdown]
So, perhaps those slower 6GB are used for other tasks, while devs use the faster 10GB for graphics, making the bandwidth behave as expected. Again, I have no idea; I'm simply speculating.
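
For reference, Microsoft's published numbers answer the bus question: the Series X uses a 320-bit bus with mixed chip densities, which is exactly where its two bandwidth figures come from. A quick sketch using the published specs:

```python
# Series X (published specs): ten GDDR6 chips at 14 Gbps on a 320-bit bus,
# six 2 GB chips plus four 1 GB chips = 16 GB total. The first 10 GB stripe
# across all ten chips; the remaining 6 GB exist only on the six denser ones.
RATE_GBPS = 14

def pool_bandwidth_gb_s(active_chips: int) -> float:
    return RATE_GBPS * active_chips * 32 / 8  # one 32-bit channel per chip

print(pool_bandwidth_gb_s(10))  # "GPU optimal" 10 GB pool: 560.0 GB/s
print(pool_bandwidth_gb_s(6))   # "standard"     6 GB pool: 336.0 GB/s
```

So it's the same mixed-density trick discussed above, just with the OS and non-graphics data steered into the slower region.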
 