8 or 12 gigs

Good grief, sensitive much? :rolleyes:

What I meant by "people like you" was simply people who think this is only marketing and who don't seem to understand what the feasible options were for memory capacity on this card.

It's not about sensitivity. I come on here for mature tech discussions. If I wanted to deal with rudeness, I would have logged on to Twitter.

In any case, I think it's clear why Nvidia went with 12GB. I don't think they originally intended to, given the disparity we're seeing between the memory buffers on this generation of cards, particularly because they had already launched a 3060 Ti in an effort to beat AMD to that segment of the market for this upgrade cycle. They needed a higher-memory option to sell to customers who will hold up both cards and say, "Well, this one is 8GB and this one is 12GB, so why would I get the 8GB?" That mindset makes up a significant portion of the tech market, and it gives OEMs the opportunity to advertise "OMG 12GB OF VRAM" in their flyers. I have little doubt this is why Nvidia made that decision, and reducing the bus to 192-bit is a convenient consequence: they had to give the card a spec-sheet disadvantage versus the 3060 Ti, because otherwise those owners would question why their card has less VRAM but costs more. Plus, a narrower bus also costs less.
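On the "feasible options" point: each GDDR6/GDDR6X chip occupies a 32-bit slice of the memory bus, so the bus width fixes the chip count, and the chip density fixes the capacity choices. A back-of-the-envelope sketch (the function name is mine, purely for illustration):

```python
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    """Feasible VRAM capacities for a given bus width.

    Assumes one GDDR6/GDDR6X chip per 32-bit channel and the two
    chip densities that were shipping at the time (1GB and 2GB).
    Returns {chip_density_GB: total_capacity_GB}.
    """
    chips = bus_width_bits // 32
    return {d: chips * d for d in densities_gb}

# 192-bit bus (RTX 3060): 6 chips -> 6GB or 12GB
print(vram_options(192))  # {1: 6, 2: 12}

# 256-bit bus (RTX 3060 Ti): 8 chips -> 8GB or 16GB
print(vram_options(256))  # {1: 8, 2: 16}
```

So on a 192-bit bus the only sane options really were 6GB or 12GB; an 8GB 3060 was never on the table without changing the bus width.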
 
There is the real truth. Nvidia likely still had time to change the card and give it a 12GB frame buffer, given that AMD's solutions have far more VRAM, which is bad for Nvidia from a marketing standpoint. Other cards like the 3080 were already being built as 10GB models before Nvidia knew AMD's 6800 XT would have 16GB of RAM; otherwise Nvidia would have gone with 20GB.
Bingo. I have no doubt this is what happened. Both Nvidia and AMD know that sales of computers and computer parts have been extremely robust during COVID. Being first to market during a time of heavy upgrading is critical to capture sales before the other guy has a chance to get in there, but the downside is you don't know what the other guy has in store. I never envisioned AMD dropping a 16GB VRAM buffer, and I doubt Nvidia did either.
 
Oh I have no doubt that will be the case. VRAM capacities in big letters on boxes sell cards...
The idiotic naming schemes make me think that's all part of the plan: be as confusing as possible while letting the marketing department run wild...
 
Bingo. I have no doubt this is what happened. Both Nvidia and AMD know that sales of computers and computer parts have been extremely robust during COVID. Being first to market during a time of heavy upgrading is critical to capture sales before the other guy has a chance to get in there, but the downside is you don't know what the other guy has in store. I never envisioned AMD dropping a 16GB VRAM buffer, and I doubt Nvidia did either.
It was rumored for probably nearly half a year that the top Navi cards would have 16 gigs of vram. Pretty much every single outlet that had reputable sources was saying that so it was no secret.
 
There is the real truth. Nvidia likely still had time to change the card and give it a 12GB frame buffer, given that AMD's solutions have far more VRAM, which is bad for Nvidia from a marketing standpoint. Other cards like the 3080 were already being built as 10GB models before Nvidia knew AMD's 6800 XT would have 16GB of RAM; otherwise Nvidia would have gone with 20GB.
The best Nvidia could have done, at least with the current RTX 3080 PCB design, is 12GB.

They could have used the RTX 3090 PCB, but then the cost would have increased a lot. There's a reason we still don't have a 20GB version yet.
 
It was rumored for probably nearly half a year that the top Navi cards would have 16 gigs of vram. Pretty much every single outlet that had reputable sources was saying that so it was no secret.
Yeah, but I think the fact that Big Navi challenged performance where it did in the Ampere stack took Nvidia by surprise. Had Big Navi fallen far shorter than it did, VRAM capacity on it wouldn't have mattered. As it is, challenging the 3080 and 3090 in raster while having 16GB made the 8GB 3070 and 10GB 3080 seem like Nvidia being miserly from a marketing perspective.
 
The best Nvidia could have done, at least with the current RTX 3080 PCB design, is 12GB.

They could have used the RTX 3090 PCB, but then the cost would have increased a lot. There's a reason we still don't have a 20GB version yet.
Aren't they functionally the same PCB, the only difference being that the 3090 has VRAM chips on the back? If they start getting 2GB GDDR6X chips they can easily do a 20GB card with the existing PCB design only mounting on the front.

Personally I think the lack of a 20GB version has more to do with how everyone and their mother is buying a 3090 whenever they're in stock. Why would Nvidia divert GA102 dies to cheaper cards when that's happening?
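For reference on the PCB question above: the 3090's 24GB comes from running two chips per 32-bit channel in clamshell mode (chips mounted front and back), while a 20GB 3080 would only need 2GB chips on the existing ten front-side pads. A rough sketch of that arithmetic (the function name is mine, just for illustration):

```python
def capacity_gb(bus_bits, chip_density_gb, clamshell=False):
    """Total VRAM for a given bus width and chip density.

    Each 32-bit channel normally carries one chip; in clamshell
    mode two chips share a channel (each running in x16 mode),
    doubling capacity without widening the bus.
    """
    channels = bus_bits // 32
    chips = channels * (2 if clamshell else 1)
    return chips * chip_density_gb

print(capacity_gb(320, 1))                  # RTX 3080: 10 chips x 1GB = 10
print(capacity_gb(320, 2))                  # hypothetical 20GB 3080 = 20
print(capacity_gb(384, 1, clamshell=True))  # RTX 3090: 24 chips x 1GB = 24
```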
 
"People like you". Wow, way to be a douche. If you're done stroking your inflated sense of superiority: I'm not saying this from a technical standpoint; I'm saying it from a marketing standpoint. No one told Nvidia they absolutely had to go with a 192-bit bus.
Exactly right. If the 6700 XT weren't coming with 12GB, they would have been happy to release a 3060 with the same VRAM as a 3060 Ti or less.
 
Aren't they functionally the same PCB, the only difference being that the 3090 has VRAM chips on the back? If they start getting 2GB GDDR6X chips they can easily do a 20GB card with the existing PCB design only mounting on the front.
The RTX 3090 PCB is much more complex and much more expensive to produce.

Thing is, 16Gb GDDR6X chips from Micron haven't even been announced yet. Don't expect them earlier than the second half of this year at best.
 
Exactly right. If the 6700 XT weren't coming with 12GB, they would have been happy to release a 3060 with the same VRAM as a 3060 Ti or less.
I think they wanted to release it as a 6GB card, but at this stage in the Ampere release cycle, with the perception around VRAM capacity in the wake of RDNA2, that would have looked bad from a marketing standpoint. Especially with how hard they were going after 1060 owners with it, and the 1060 is also a 6GB card. So three generations of 6GB 60-class cards? Not a good look.
 
The RTX 3090 PCB is much more complex and much more expensive to produce.

Thing is, 16Gb GDDR6X chips from Micron haven't even been announced yet. Don't expect them earlier than the second half of this year at best.
Well, GDDR6X was never even announced before the 3080 came out, and that's why many thought it wasn't even real.
 
And that logic actually makes sense for many people.

Heck, even some review sites are claiming that the RTX 3060 is faster than the Ti version. :(:rolleyes:
My brother is part of the "more memory is better" crowd too. He hasn't been interested in computers since the early 2000s. He just pieces together junk computers that they throw away at work. He's flabbergasted when he sees my setup and the computers I build for his kids; none of his systems even have an SSD.
 
It's true. I used to see people buying the FX 5200 with 256MB at Comp USA all the time. The GPU was never fast enough to utilize that much RAM but it was cheap and had as much or more VRAM than cards that were actually faster and more expensive.

The first GPU I bought was an 8500LE 128MB, instead of, IIRC, a 9600 or something.

I, however, had the excuse of being 13 and having no smartphone at that time😅
 
It was rumored for probably nearly half a year that the top Navi cards would have 16 gigs of vram. Pretty much every single outlet that had reputable sources was saying that so it was no secret.

It depends where you are in the engineering cycle as to whether you can still make design changes and meet launch schedules. It's entirely possible Nvidia figured they had advanced too far to change the board design by the time they found out, and pushed forward. I don't know, I wasn't there, but I find it quite unlikely they wouldn't have added VRAM knowing what was coming. I could be wrong, but I just find it unlikely they would have launched a flagship card with 10GB of RAM and lower ones at 8GB if they knew AMD was coming out at 16/12GB.

Also, remember "wait for Vega"? Lots of things fill up the rumour mill.
 
The best Nvidia could have done, at least with the current RTX 3080 PCB design, is 12GB.

They could have used the RTX 3090 PCB, but then the cost would have increased a lot. There's a reason we still don't have a 20GB version yet.

I think it's a combination of that and they'll have a really hard time launching the 3080Ti when they can't even keep stock of the current stack. Nvidia doesn't want to do another effective paper launch and further frustrate their fan base. Just look at their presentation for CES 2021, their comment section was literally just people saying "out of stock".
 
I think it's a combination of that and they'll have a really hard time launching the 3080Ti when they can't even keep stock of the current stack. Nvidia doesn't want to do another effective paper launch and further frustrate their fan base. Just look at their presentation for CES 2021, their comment section was literally just people saying "out of stock".
Actually, I think yet another "paper launch" wouldn't have made much of a difference. I mean, they already did that with the RTX 3070, 3060 Ti, and 3060. It's like painting stripes on a tiger.

I've maintained since the very first RTX 3080 Ti rumors that we would not see it until 16Gb GDDR6X chips were available.
 
I think it's a combination of that and they'll have a really hard time launching the 3080Ti when they can't even keep stock of the current stack. Nvidia doesn't want to do another effective paper launch and further frustrate their fan base. Just look at their presentation for CES 2021, their comment section was literally just people saying "out of stock".

I'm sure Nvidia is really concerned about upset not-customers begging to be able to give them money while selling every card they can manufacture.

tenor.gif
 
Actually, I think yet another "paper launch" wouldn't have made much of a difference. I mean, they already did that with the RTX 3070, 3060 Ti, and 3060. It's like painting stripes on a tiger.

I've maintained since the very first RTX 3080 Ti rumors that we would not see it until 16Gb GDDR6X chips were available.

It's possible, but I'm pretty sure they don't want any more bad press for no good reason. Why launch a card you can't produce? It's bad marketing, and it's really not necessary as a response to anything, because they already have something in the stack to answer the best AMD can bring to the table, so there's less of a rush at this point.
 
It's true. I used to see people buying the FX 5200 with 256MB at Comp USA all the time. The GPU was never fast enough to utilize that much RAM but it was cheap and had as much or more VRAM than cards that were actually faster and more expensive.
There were games at the time, and for a few years after its release, that it could play modestly while utilizing more than 128MB of VRAM, which was the FX 5200's base configuration.
Also, for static images and 3D rendering at the time, the low cost of the FX 5200 paired with 256MB of VRAM was a boon for applications outside of 3D gaming.

For gaming-only, I agree with you.
For applications outside of gaming, not so much.

It was the same with the GT 430 paired with 4GB of VRAM circa 2013: no way that GPU could handle 4GB of 3D gaming, but for applications that could utilize large amounts of VRAM with small amounts of GPU processing, it was a win due to its low cost.
 
for applications that could utilize large amounts of VRAM with small amounts of GPU processing, it was a win due to its low cost.
I'll keep saying it: games can use that VRAM without taxing the GPU. Loading the highest-quality textures you can fit in GPU memory does nothing to frame rate, and it elevates the presentation notably.

If we all suddenly had 16GB of VRAM, I guarantee devs would use way better textures, and we'd all use that memory and enjoy better-looking games. More VRAM is never a bad idea.
 
There were games at the time, and for a few years after its release, that it could play modestly while utilizing more than 128MB of VRAM, which was the FX 5200's base configuration.
Also, for static images and 3D rendering at the time, the low cost of the FX 5200 paired with 256MB of VRAM was a boon for applications outside of 3D gaming.

For gaming-only, I agree with you.
For applications outside of gaming, not so much.

It was the same with the GT 430 paired with 4GB of VRAM circa 2013: no way that GPU could handle 4GB of 3D gaming, but for applications that could utilize large amounts of VRAM with small amounts of GPU processing, it was a win due to its low cost.

The vast majority of cards sold at places like Comp USA were for gaming, not other applications. That being said, I understand your point. But again the fact is people equate RAM with speed even though the two aren't as related as is generally believed.
 
The same people who buy a 3060 with 12GB of RAM thinking it's faster than a 3060 Ti with 8GB of RAM are also the kind of people who will likely never, ever look at benchmarks or second-guess their purchase.

They also, likely, won't even realize that the cards are different at all.
 