Red Falcon
[H]F Junkie - Joined May 7, 2007 - Messages: 10,832
> I don't see "built for content creators" on there.
NVIDIA is capitalizing on the low-information gaming purchasers who think these GPUs are solely meant for gaming simply because the box says so.
It's ok to not see the bigger picture - just keep doing as you are told.

It's amazing that this has to be explained over and over and over on a hardware forum like this, but sadly it does. The RAM size is a function of the bus width and chip density (currently either 8Gbit or 16Gbit):
128bit = 4GB or 8GB -- 3050*
192bit = 6GB or 12GB -- 3060, 3050 TI*
256bit = 8GB or 16GB -- 3070, 3060 TI
320bit = 10GB or 20GB -- 3080, 3080 TI*
384bit = 12GB or 24GB -- 3090
They can't have a 3060 with a 256bit bus since that would put it too close to the 3060 TI and 3070; they can't give it only 6GB as that would force the theoretical 192-bit 3050 TI down to 3GB, which probably isn't marketable in today's mid-range segment.
*Rumored
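As a quick sanity check, the bus-to-capacity rule in that list can be sketched in a few lines of Python. The densities assumed here (8Gbit = 1GB and 16Gbit = 2GB per chip, one chip per 32-bit channel) are the standard GDDR6 options, not something stated in the post:

```python
# Bus width determines chip count (one GDDR6 chip per 32-bit channel);
# chip density then fixes the possible capacities.
# Assumed densities: 8Gbit (1 GB) or 16Gbit (2 GB) per chip.

def vram_options(bus_width_bits: int) -> list:
    """Return the two possible VRAM sizes (GB) for a given bus width."""
    chips = bus_width_bits // 32
    return [chips * 1, chips * 2]

for bus in (128, 192, 256, 320, 384):
    low, high = vram_options(bus)
    print(f"{bus}-bit = {low} GB or {high} GB")
```

Running this reproduces the table exactly: 128-bit gives 4GB or 8GB, 192-bit gives 6GB or 12GB, and so on up to 384-bit at 12GB or 24GB.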
> It's amazing that this has to be explained over and over and over on a hardware forum like this but sadly it does.
I don't think people are confused by the bit-to-capacity logic, but by why they would put 12GB of memory on a 3060 when there are very few situations where that much VRAM would be needed on a card of this performance level. Content creation is brought up, but I don't think people who do a lot of video encoding are going to be looking at this card. I could be wrong.
> I don't think people are confused by the bit-to-capacity logic, but by why they would put 12GB of memory on a 3060 when there are very few situations where that much VRAM would be needed on a card of this performance level. I could be wrong.
Well, there are plenty of people that do not seem to understand the tie-in between bus width and the amount of capacity that can be used with it. And again, it's not about using 12 gigs of VRAM, but about needing more than 6, which is absolutely a necessity for a card of that level even in a couple of games right now, never mind upcoming games. It's going to get pretty tiresome having to say that every time someone says that video card doesn't need 12 gigs. NVIDIA even flat out said that 6 gigs would not be enough on a card like that when Red Gaming Tech reached out to them.
> NVIDIA is capitalizing on the low-information gaming purchasers who think these GPUs are solely meant for gaming simply because the box says so.
> It's ok to not see the bigger picture - just keep doing as you are told.
Whoooosshhhhh.
> Whoooosshhhhh.
My thoughts exactly.
> 3060 gets 12GB VRAM, but 3080 only gets 10GB?
> Ummmmmm....
192 bit bus = 6 memory chips = 3/6/12 GB size
256 bit bus = 8 memory chips = 4/8/16 GB size
320 bit bus = 10 memory chips = 5/10/20 GB size
352 bit bus = 11 memory chips = 5.5/11/22 GB size
Memory size is more a function of the bandwidth needed to saturate the core and not the size of the frame buffer.
> Isn't RTX for gaming and not professional work, according to the terms agreement?
They scrapped those when they launched the Creator drivers for their consumer-class cards and killed the Quadro product line.
> I don't think people are confused by the bit-to-capacity logic, but by why they would put 12GB of memory on a 3060 when there are very few situations where that much VRAM would be needed on a card of this performance level. Content creation is brought up, but I don't think people who do a lot of video encoding are going to be looking at this card. I could be wrong.
I am recommending them to all the graduating drafting, animation, video editing, and CAD prospects. A good laptop with a 3060 and they are set for university. 12GB gets gobbled fast by most of those tools, but it's about the best option out there currently unless you want to pay much, much more.
> Digital Foundry had even hit the VRAM limit on the 2060 in a couple of games when they tested two years ago.
The way I understand it, cards for years now have spilled into shared system RAM when they run out of VRAM. I wonder how much of a hit that is.
> The way I understand it, cards for years now have spilled into shared system RAM when they run out of VRAM. I wonder how much of a hit that is.
Massive.
> Massive.
Do you have a ballpark figure for "massive"? People said the same thing about eGPUs being limited to an x4 link, but IIRC the hit was well under 30% when I went looking a couple of years ago.
Shared system RAM prevents the application from crashing, but even quad-channel DDR4 wouldn't be enough to keep up with the bandwidth demanded of even a mid-range GPU.
> The way I understand it, cards for years now have spilled into shared system RAM when they run out of VRAM. I wonder how much of a hit that is.
Well, remember the performance differences between the DDR4 and GDDR5 versions of cards. Now try to imagine the difference between GDDR6 and DDR4, then add all the latency and overhead created by that spillover and its access times.
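To put rough numbers on that gap: peak memory bandwidth is just bus width in bytes times transfer rate. The configurations below are illustrative assumptions (a 3060-class 192-bit GDDR6 card at 15 Gbps versus dual-channel DDR4-3200 system memory), not figures from the thread:

```python
# Peak bandwidth in GB/s = (bus width in bytes) * (transfers per second, GT/s).
# Illustrative: 192-bit GDDR6 @ 15 Gbps vs dual-channel (2 x 64-bit) DDR4-3200.

def bandwidth_gbs(bus_bits: int, rate_gtps: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return (bus_bits / 8) * rate_gtps

gddr6 = bandwidth_gbs(192, 15.0)  # 360 GB/s on-card
ddr4 = bandwidth_gbs(128, 3.2)    # 51.2 GB/s in system RAM
print(f"GDDR6: {gddr6:.0f} GB/s vs DDR4: {ddr4:.1f} GB/s (~{gddr6 / ddr4:.0f}x)")
```

That is roughly a 7x gap before you even count the latency of the round trip over PCIe, which is why spilling out of VRAM hurts so much.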
> Do you have a ballpark figure for "massive"? People said the same thing about eGPUs being limited to an x4 link, but IIRC the hit was well under 30% when I went looking a couple of years ago.
The GTX 970's 3.5GB+512MB split showcases exactly what happens, as the effect is basically the same:
> The GTX 970 3.5GB+512MB GPU showcases exactly what happens, as the effect is basically the same:
Wow. Had no idea it was that bad.
View attachment 318543
> Wow. Had no idea that was that bad.
It depends on the game and application as well.
> 192 bit bus = 6 memory chips = 3/6/12 GB size
> 256 bit bus = 8 memory chips = 4/8/16 GB size
> 320 bit bus = 10 memory chips = 5/10/20 GB size
> 352 bit bus = 11 memory chips = 5.5/11/22 GB size
> Memory size is more a function of the bandwidth needed to saturate the core and not the size of the frame buffer.
No offense to you or any other member on this forum, and trust me, I'm trying to be polite here, but I don't need an explanation of how a memory bus on a GPU works; I've used it many a time to explain why an RTX 3080 does better at 4K than an RX 6800 XT. I was simply pointing out that an RTX 3060 will have more memory than an RTX 3080. That doesn't mean said RTX 3060 will be faster... because it won't... ever.
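The chips-per-bus arithmetic in that list follows directly from each GDDR6 chip having a 32-bit interface. A short sketch (the three per-chip densities of 0.5, 1, and 2 GB, i.e. 4/8/16Gbit, are assumptions about the chip generations, not stated in the post) reproduces it:

```python
# Chip count = bus width / 32 (each GDDR6 chip has a 32-bit interface);
# capacity = chip count * per-chip density (0.5, 1, or 2 GB assumed here).

def chip_layout(bus_bits: int, densities_gb=(0.5, 1, 2)):
    """Return (chip count, list of possible capacities in GB)."""
    chips = bus_bits // 32
    return chips, [chips * d for d in densities_gb]

for bus in (192, 256, 320, 352):
    chips, sizes = chip_layout(bus)
    line = "/".join(f"{s:g}" for s in sizes)
    print(f"{bus} bit bus = {chips} memory chips = {line} GB size")
```

The printed lines match the quoted list exactly, including the odd 5.5/11/22 GB options of an 11-chip, 352-bit bus.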
> No offense to you or any other member on this forum, and trust me, I'm trying to be polite here, but I don't need an explanation of how a memory bus on a GPU works. I was simply pointing out that an RTX 3060 will have more memory than an RTX 3080. That doesn't mean said RTX 3060 will be faster... because it won't... ever.
No, no, more VRAM is always better, especially when it is on multiple NVIDIA GPUs - remember, the more (VRAM) you buy, the more you save!
Please move along and don't rehash this point again. Thanks.
Because at a certain point more VRAM is pointless if the card isn't fast enough to support it. They have a 3060 Ti with 8 GB and now a 3060 with 12 GB... Why?
> My GTX 1060 6GB graphics card can't max out Doom Eternal at 1080p. It's not due to a lack of performance (it runs ultra nightmare effects plus Ultra textures at 80 fps); they set a hard VRAM limit I can't push above. The fact that you can already find cases like this today tells you how fucking pointless it would be to sell a graphics card with only 6GB VRAM at 2070 performance!
Exactly.
They stopped doing the split memory buses of the GTX 660 days, so now it's 6GB or 12GB.
Ugh...boring. I was hoping the one last thing would be a 3080ti.
> That is held up by the lack of GDDR6X 16Gb chips.
> Micron is the sole source for those parts!
> Luckily, NVIDIA has finally acknowledged that those higher-density GDDR6 chips exist! Maybe when they finally announce the 3080 Ti, they will also launch the 3070 Super (16GB + 16 Gbps speed would be awesome!)
They could just put the memory chips on both sides of the board like the 3090.
> Why would NVIDIA pay the same price premium for a $1000 video card that they do for a $1500 one?
> Those exotic chips are not cheap, and when you have nearly 85% of the VRAM chips of a 3090 (and you have to reuse its bigger PCB to fit 10 more chips), then you would expect to pay over $1200 for this smaller memory-bus cut.
> But it's much cheaper to double the memory capacity of those ten chips and then reuse the same simplified PCB design as the 3080. Adding a better cut of that GA102 (+$200) and upgrading those ten to double-density RAM chips (+$100) makes the $1000 price feasible.
> Or you can stop being whiny and pony up the cash for a 3090 NOW! If $2000 buy-it-now is too rich for your blood, then what do you expect Scalpers Inc. to charge when this new SKU launches?
> I see an $1800 eBay 3080 Ti in our future!
I haven't heard anything about 16Gb GDDR6X yet. Adding another 10x 8Gb GDDR6X chips would only add around $100 to the price of materials, according to the info available.
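That pricing argument is simple addition. Taking the 3080's $699 launch MSRP as the base (an official figure; the +$200 and +$100 deltas are the poster's estimates, not BOM data), it lands right at the rumored price point:

```python
# Back-of-envelope check of the pricing logic above. Only the $699 3080 MSRP
# is an official figure; the two deltas are the poster's own estimates.
base_3080_msrp = 699
better_ga102_cut = 200        # estimated premium for the fuller GA102 die
double_density_gddr6x = 100   # estimated cost of 16Gb instead of 8Gb chips

total = base_3080_msrp + better_ga102_cut + double_density_gddr6x
print(f"Estimated 3080 Ti price: ${total}")  # -> $999, i.e. the ~$1000 mark
```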
> Why would NVIDIA pay the same price premium for a $1000 video card that they do for a $1500 one?
> Those exotic chips are not cheap, and when you have nearly 85% of the VRAM chips of a 3090 (and you have to reuse its bigger PCB to fit 10 more chips), then you would expect to pay over $1200 for this smaller memory-bus cut.
> But it's much cheaper to double the memory capacity of those ten chips and then reuse the same simplified PCB design as the 3080. Adding a better cut of that GA102 (+$200) and upgrading those ten to double-density RAM chips (+$100) makes the $1000 price feasible.
> Or you can stop being whiny and pony up the cash for a 3090 NOW! If $2000 buy-it-now is too rich for your blood, then what do you expect Scalpers Inc. to charge when this new SKU launches?
> I see an $1800 eBay 3080 Ti in our future!
I don't expect a 3080 Ti FE to be less than $1200. I can see AIBs going for $1400+.
> Gamers Nexus is claiming a 3080 Ti was on the cards for CES but pulled because of the stock situation with current cards.
> 8-minute mark:
I think that makes the most sense. 3080 stock has been so horrific, selling out instantly. Why release another card now to compete when you can keep milking the 80 until the fall and have the Ti be the next big thing for the holidays?
> Why release another card now to compete when you can keep milking the 80 until the fall and have the Ti be the next big thing for the holidays?
The bigger the base of 3080s, the more suckers there are to upgrade to the 3080 Ti when it comes out!