NVIDIA GeForce Event

I don't see "built for content creators" on there.
NVIDIA is capitalizing on the low-information gaming purchasers who think these GPUs are solely meant for gaming simply because the box says so.
It's ok to not see the bigger picture - just keep doing as you are told. 🤖
 
The RAM size is a function of the bus width and chip density (currently either 8 Gbit or 16 Gbit).

128-bit = 4GB or 8GB -- 3050*
192-bit = 6GB or 12GB -- 3060, 3050 Ti*
256-bit = 8GB or 16GB -- 3070, 3060 Ti
320-bit = 10GB or 20GB -- 3080, 3080 Ti*
384-bit = 12GB or 24GB -- 3090


They can't give the 3060 a 256-bit bus, since that would put it too close to the 3060 Ti and 3070, and they can't give it only 6GB, as that would force the theoretical 192-bit 3050 Ti down to 3GB, which probably isn't marketable in today's mid-range segment.

*Rumored
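The bus-width-to-capacity relation above can be sketched in a few lines of Python (my own illustration, assuming GDDR6's 32-bit per-chip interface and the 8/16 Gbit densities mentioned):

```python
# Each GDDR6 chip has a 32-bit interface, so chip count = bus width / 32,
# and total VRAM = chip count x per-chip density (8 Gbit -> 1 GB, 16 Gbit -> 2 GB).

def vram_options(bus_width_bits, densities_gbit=(8, 16)):
    """Possible VRAM sizes in GB for a given memory bus width."""
    chips = bus_width_bits // 32            # one chip per 32-bit channel
    return [chips * d // 8 for d in densities_gbit]  # Gbit -> GB

for bus in (128, 192, 256, 320, 384):
    print(bus, vram_options(bus))
# 192-bit -> 6 chips -> [6, 12]: exactly why the 3060 is 6GB or 12GB, nothing between
```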
 
The RAM size is a function of the bus width and chip density (currently either 8 Gbit or 16 Gbit).

128-bit = 4GB or 8GB -- 3050*
192-bit = 6GB or 12GB -- 3060, 3050 Ti*
256-bit = 8GB or 16GB -- 3070, 3060 Ti
320-bit = 10GB or 20GB -- 3080, 3080 Ti*
384-bit = 12GB or 24GB -- 3090


They can't give the 3060 a 256-bit bus, since that would put it too close to the 3060 Ti and 3070, and they can't give it only 6GB, as that would force the theoretical 192-bit 3050 Ti down to 3GB, which probably isn't marketable in today's mid-range segment.

*Rumored
It's amazing that this has to be explained over and over and over on a hardware forum like this, but sadly it does.
 
It's amazing that this has to be explained over and over and over on a hardware forum like this, but sadly it does.
I don't think people are confused by the bit-to-capacity logic, but by why they would put 12GB of memory on a 3060 when there are very few situations where that much VRAM would be needed on a card of this performance level. Content creation is brought up, but I don't think people who do a lot of video encoding are going to be looking at this card. I could be wrong.
 
I have to wonder...just how much worse the shortages will get with these new chips now in the mix.
 
I don't think people are confused by the bit-to-capacity logic, but by why they would put 12GB of memory on a 3060 when there are very few situations where that much VRAM would be needed on a card of this performance level. Content creation is brought up, but I don't think people who do a lot of video encoding are going to be looking at this card. I could be wrong.
Well, there are plenty of people who do not seem to understand the tie-in between bus width and the amount of capacity that can be used with it. And again, it's not about using 12 gigs of VRAM but about needing more than 6, which is absolutely a necessity for a card of that level, even in a couple of games right now, never mind upcoming games. It's going to get pretty tiresome having to say that every time someone claims this video card doesn't need 12 gigs. NVIDIA even flat out said that 6 gigs would not be enough on a card like that when RedGamingTech reached out to them.
 
NVIDIA is capitalizing on the low-information gaming purchasers who think these GPUs are solely meant for gaming simply because the box says so.
It's ok to not see the bigger picture - just keep doing as you are told. 🤖
Whoooosshhhhh.
 
3060 gets 12GB VRAM, but 3080 only gets 10GB?

Ummmmmm....
192-bit bus = 6 memory chips = 3/6/12 GB size
256-bit bus = 8 memory chips = 4/8/16 GB size
320-bit bus = 10 memory chips = 5/10/20 GB size
352-bit bus = 11 memory chips = 5.5/11/22 GB size

Memory size is more a function of the bandwidth needed to saturate the core and not the size of the frame buffer.
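That bandwidth point is easy to put in numbers. A rough sketch (my own illustration; the data rates are the public launch specs for these cards):

```python
# Peak memory bandwidth = bus width (bytes) x per-pin data rate.
# Capacity doesn't appear anywhere in this formula, which is the point.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(320, 19))    # 3080 (19 Gbps GDDR6X): 760.0 GB/s
print(bandwidth_gbs(384, 19.5))  # 3090 (19.5 Gbps GDDR6X): 936.0 GB/s
print(bandwidth_gbs(192, 15))    # 3060 (15 Gbps GDDR6): 360.0 GB/s
```

So the 3080's 10 chips exist to feed the core, and the 3060's 12GB is just the only capacity above 6GB that a 192-bit bus allows.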
 
Isn't RTX for gaming and not professional work, according to the terms of agreement?
They scrapped those when they launched the Creator drivers for their consumer-class cards and killed the Quadro product line.
Edit:
Their consumer cards now also no longer give warnings when installed on Windows Server. I just stuck a stack of 2060s in a lab server I'm repurposing for a Digital Media course one of the teachers here wants to run next semester.
 
I don't think people are confused by the bit- to capacity- logic, but why they would put 12GB of memory on a 3060 when there are very few situations where that much VRAM would be needed on a card of this performance level. Content creation is brought up, but I don't think people who do a lot of video encoding are going to be looking at this card. I could be wrong.
I am recommending them to all the graduating drafting, animation, video editing, and cad prospects. A good laptop with a 3060 and they are set for university 12GB gets gobbled fast by most of those tools, but it’s about the best option out there currently unless you want to pay much much more.
 
Digital Foundry had even hit the VRAM limit on the 2060 in a couple of games when they tested it two years ago.
The way I understand it, cards have for years now spilled into shared system RAM when they run out of VRAM. I wonder how much of a hit that is.
 
The way I understand it, cards have for years now spilled into shared system RAM when they run out of VRAM. I wonder how much of a hit that is.
Massive.
Spilling into shared RAM prevents the application from crashing, but even quad-channel DDR4 wouldn't be enough to keep up with the performance demanded of even a mid-range GPU.
 
Massive.
Spilling into shared RAM prevents the application from crashing, but even quad-channel DDR4 wouldn't be enough to keep up with the performance demanded of even a mid-range GPU.
Do you have a ballpark figure for "massive"? People said the same thing about eGPUs being limited to an x4 link but IIRC the hit was well under 30% when I went looking a couple of years ago.
 
The way I understand it, cards have for years now spilled into shared system RAM when they run out of VRAM. I wonder how much of a hit that is.
Well, remember the performance differences between the DDR and GDDR versions of the same cards? Now try to imagine the difference between GDDR6 and DDR4, then add all the latency and overhead created by that spillover and its access times.
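Some back-of-envelope numbers for how big that gap is (my own illustration, not from the thread; the PCIe figure is the approximate practical rate of a gen3 x16 link, which the spilled data also has to cross):

```python
# The spill path is bounded by the slower of system RAM and the PCIe link.

GDDR6_256BIT_14GBPS = 256 / 8 * 14      # 448.0 GB/s on-card (e.g. 3060 Ti)
DDR4_3200_QUAD      = 4 * 64 / 8 * 3.2  # 102.4 GB/s quad-channel system RAM
PCIE3_X16           = 16.0              # ~16 GB/s practical gen3 x16 rate

spill_path = min(DDR4_3200_QUAD, PCIE3_X16)
print(GDDR6_256BIT_14GBPS / spill_path)  # 28.0: on-card VRAM is ~28x faster
```

That ratio is why an eGPU over an x4 link (everything still resident in VRAM) hurts far less than actually overflowing VRAM, where hot data lives on the wrong side of the link.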
 
Do you have a ballpark figure for "massive"? People said the same thing about eGPUs being limited to an x4 link but IIRC the hit was well under 30% when I went looking a couple of years ago.
The GTX 970 3.5GB+512MB GPU showcases exactly what happens, as the effect is basically the same:
[GIF: GTX 970 stuttering once it spills past the fast 3.5GB segment]
 
Wow. Had no idea that was that bad.
It depends on the game and application as well.
If the game is running with low enough settings, sometimes there won't be a noticeable performance hit.

For any AAA game in the last decade with above-medium settings of any kind, it will definitely be immediately noticeable, haha.
 
192-bit bus = 6 memory chips = 3/6/12 GB size
256-bit bus = 8 memory chips = 4/8/16 GB size
320-bit bus = 10 memory chips = 5/10/20 GB size
352-bit bus = 11 memory chips = 5.5/11/22 GB size

Memory size is more a function of the bandwidth needed to saturate the core and not the size of the frame buffer.
No offense to you or any other member on this forum, and trust me, I'm trying to be polite here, but I don't need an explanation of how a memory bus on a GPU works; I've used it many a time to explain why an RTX 3080 does better at 4K than an RX 6800 XT. I was simply pointing out the fact that an RTX 3060 will have more memory than an RTX 3080. That doesn't mean said RTX 3060 will be faster... because it won't... ever.

Please move along and don't rehash this point again. Thanks.
 
No offense to you or any other member on this forum, and trust me, I'm trying to be polite here, but I don't need an explanation of how a memory bus on a GPU works; I've used it many a time to explain why an RTX 3080 does better at 4K than an RX 6800 XT. I was simply pointing out the fact that an RTX 3060 will have more memory than an RTX 3080. That doesn't mean said RTX 3060 will be faster... because it won't... ever.

Please move along and don't rehash this point again. Thanks.
No no, more VRAM is always better, especially when it is on multiple NVIDIA GPUs - remember, the more (VRAM) you buy, the more you save! :D
 
Because at a certain point more VRAM is pointless if the card isn't fast enough to support it. They have a 3060 Ti with 8 GB and now a 3060 with 12 GB...Why?


My GTX 1060 6GB graphics card can't max out Doom Eternal at 1080p. It's not due to a lack of performance
(it runs Ultra Nightmare effects plus Ultra textures at 80 fps); they set a hard VRAM limit I can't push above.

The fact that you can already find cases like this today tells you how fucking pointless it would be to sell a graphics card with only 6GB VRAM at 2070 performance!

They have stopped using the split memory buses of the GTX 660 days, so now it's 6 or 12; they need to start using the 2GB chips SOMETIME, so why not now?
 
My GTX 1060 6GB graphics card can't max out Doom Eternal at 1080p. It's not due to a lack of performance
(it runs Ultra Nightmare effects plus Ultra textures at 80 fps); they set a hard VRAM limit I can't push above.

The fact that you can already find cases like this today tells you how fucking pointless it would be to sell a graphics card with only 6GB VRAM at 2070 performance!

They have stopped using the split memory buses of the GTX 660 days, so now it's 6 or 12.
Exactly.
This is the same thing that holds my GTX 980 Ti back (when actually paired with a decent CPU) at 1080p - the GPU is more than powerful enough, but 6GB VRAM simply is not enough to push all of the textures at higher settings.

Heck, it ran Fallout 4 at max settings at 2K easily, but with the HD texture pack installed it just killed the performance since 6GB VRAM was simply not enough.
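A quick estimate of why an HD texture pack eats 6GB so fast (my own illustration; the sizes are for uncompressed RGBA8 textures with a full mip chain, an assumption on my part, since real games mix compressed formats):

```python
# An uncompressed RGBA8 texture is width x height x 4 bytes;
# a full mip chain adds roughly another 1/3 on top of the base level.

def texture_mib(side_px, bytes_per_px=4, mipmapped=True):
    """Approximate VRAM footprint of one square texture, in MiB."""
    base = side_px * side_px * bytes_per_px
    total = base * 4 / 3 if mipmapped else base
    return total / 2**20

per_tex = texture_mib(4096)          # ~85.3 MiB per 4K texture
print(per_tex, 6 * 1024 / per_tex)   # only ~72 such textures fill 6 GB
```

Even with block compression cutting that by 4-8x, a texture pack that upgrades thousands of assets overwhelms a 6GB buffer quickly.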
 
Ugh... boring. I was hoping the one last thing would be a 3080 Ti.


That is held up by the lack of 16 Gb GDDR6X chips.

Micron is the sole source for those parts!

Luckily, NVIDIA has finally acknowledged that those higher-density GDDR6 chips exist! Maybe when they finally announce the 3080 Ti, they will also launch a 3070 Super (16GB at 16 Gbps would be awesome!)
 
That is held up by the lack of 16 Gb GDDR6X chips.

Micron is the sole source for those parts!

Luckily, NVIDIA has finally acknowledged that those higher-density GDDR6 chips exist! Maybe when they finally announce the 3080 Ti, they will also launch a 3070 Super (16GB at 16 Gbps would be awesome!)
They could just put the memory chips on both sides of the board like the 3090.
 
I was hoping for a 3080 Ti announcement yesterday...looks like end of March at the earliest
 
They could just put the memory chips on both sides of the board like the 3090.

Why would NVIDIA pay the same price premium for a $1000 video card that they do for a $1500 one?

Those exotic chips are not cheap, and when you have nearly 85% of the VRAM chips of a 3090 (and you have to reuse its bigger PCB to fit 10 more chips), you would expect to pay over $1200 for this smaller memory-bus cut.

But it's much cheaper to double the memory capacity of those ten chips and reuse the same simplified PCB design as the 3080. Adding a better cut of that GA102 (+$200) and upgrading those ten to double-density RAM chips (+$100) makes the $1000 price feasible.

Or you can stop being whiny and pony up the cash for a 3090 NOW! If $2000 buy-it-now is too rich for your blood, then what do you expect Scalpers, Inc. to charge when this new SKU launches?

I see an $1800 eBay 3080 Ti in our future!
 
Why would NVIDIA pay the same price premium for a $1000 video card that they do for a $1500 one?

Those exotic chips are not cheap, and when you have nearly 85% of the VRAM chips of a 3090 (and you have to reuse its bigger PCB to fit 10 more chips), you would expect to pay over $1200 for this smaller memory-bus cut.

But it's much cheaper to double the memory capacity of those ten chips and reuse the same simplified PCB design as the 3080. Adding a better cut of that GA102 (+$200) and upgrading those ten to double-density RAM chips (+$100) makes the $1000 price feasible.

Or you can stop being whiny and pony up the cash for a 3090 NOW! If $2000 buy-it-now is too rich for your blood, then what do you expect Scalpers, Inc. to charge when this new SKU launches?

I see an $1800 eBay 3080 Ti in our future!
I haven't heard anything about 16 Gb GDDR6X yet. Adding another 10x 8 Gb GDDR6X chips would only add around $100 to the bill of materials, according to the info available.
 
Why would NVIDIA pay the same price premium for a $1000 video card that they do for a $1500 one?

Those exotic chips are not cheap, and when you have nearly 85% of the VRAM chips of a 3090 (and you have to reuse its bigger PCB to fit 10 more chips), you would expect to pay over $1200 for this smaller memory-bus cut.

But it's much cheaper to double the memory capacity of those ten chips and reuse the same simplified PCB design as the 3080. Adding a better cut of that GA102 (+$200) and upgrading those ten to double-density RAM chips (+$100) makes the $1000 price feasible.

Or you can stop being whiny and pony up the cash for a 3090 NOW! If $2000 buy-it-now is too rich for your blood, then what do you expect Scalpers, Inc. to charge when this new SKU launches?

I see an $1800 eBay 3080 Ti in our future!
I don't expect a 3080 Ti FE to be less than $1200. I can see AIBs going for $1400+.
 
Gamers Nexus is claiming a 3080 Ti was on the cards for CES but was pulled because of the stock situation with current cards.

8 minute mark:

 
Gamers Nexus is claiming a 3080 Ti was on the cards for CES but was pulled because of the stock situation with current cards.

8 minute mark:
I think that makes the most sense. 3080 stock has been so horrific, selling out instantly. Why release another card now to compete when you can keep milking the 3080 until the fall and have the Ti be the next big thing for the holidays?
Smart play by NVIDIA, let me go scoop up some more shares.... 😂😂😂
 
Why release another card now to compete when you can keep milking the 3080 until the fall and have the Ti be the next big thing for the holidays?
The bigger the base of 3080s, the more suckers there are to upgrade to the 3080 Ti when it comes out!
 