VideoCardz: Micron confirms NVIDIA GeForce RTX 3090 gets 21Gbps GDDR6X memory

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
11,262
A Micron document inadvertently confirms the 3090 product name and memory config (12 GB GDDR6X, 384-bit bus), while shooting down claims that it would be 20GB-24GB. This table was cropped out of the Micron doc:

https://videocardz.com/newz/micron-confirms-nvidia-geforce-rtx-3090-gets-21gbps-gddr6x-memory

NVIDIA-GeForce-RTX-3090-Memory-Specifications-1.png

Edit: As pointed out, this does NOT preclude a 24GB card.

Given everything we have seen, there likely will be 24GB cards. I start to wonder if there will be 12GB cards as well...
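For reference, the rumored numbers are easy to sanity-check. A quick back-of-the-envelope bandwidth calc (my own sketch using the leaked figures, not anything from the Micron doc itself):

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gb/s) * bus width (bits) / 8
data_rate_gbps = 21   # rumored 21 Gbps per pin for GDDR6X
bus_width_bits = 384  # rumored 384-bit bus

bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")  # prints "Peak bandwidth: 1008 GB/s"
```

That would put it just over the symbolic 1 TB/s mark, versus 616 GB/s on the 2080 Ti.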
 
Last edited:
Eh...I wouldn't say it confirms that 24GB isn't happening. Look at column 3 where the Titan RTX and RX5700XT are listed with 12GB underneath. Neither has that configuration.

At most it confirms the RTX3090 and GDDR6X.
 
Eh...I wouldn't say it confirms that 24GB isn't happening. Look at column 3 where the Titan RTX and RX5700XT are listed with 12GB underneath. Neither has that configuration.

At most it confirms the RTX3090 and GDDR6X.

Fair enough. Probably confirms 384 bit bus. I still think 3090 is the gaming card and will have 12GB, 24GB will be for workstation cards with MUCH fatter margins.
 
Hopefully the 3090 will fit into my NZXT H210i, at least some variant of it :(

$1300 or less and I’m in as long as the performance is a big leap over the 2080Ti
 
3090 certainly means a new price tier is coming. $1500+ is my guess.

Also, props to twitter leakers KatCorgi and kopite7kimi who have been talking about these memory speeds for a long time.
 
Hopefully the 3090 will fit into my NZXT H210i, at least some variant of it :(

$1300 or less and I’m in as long as the performance is a big leap over the 2080Ti
What percentage are you considering a big leap? As for me, I'm in the market for +20-25% over the 2080 Ti on the 3080 or above.
 
Fair enough. Probably confirms 384 bit bus. I still think 3090 is the gaming card and will have 12GB, 24GB will be for workstation cards with MUCH fatter margins.

Perhaps. But on the other hand, the top xx80tis have been stuck at 11 GB for 4+ years. I think adding another 1 GB is just too small an increase at this time and 16GB doesn't fit the bus. Something >20 GB on 320-384 bits sounds more likely.
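To unpack the "16GB doesn't fit the bus" point: each GDDR6/GDDR6X chip hangs off its own 32-bit channel, so total capacity is (bus width / 32) times the per-chip density. A rough sketch, assuming the 1 GB (8Gb) and 2 GB (16Gb) chip densities in production, and ignoring clamshell configs, which would double these:

```python
# Possible VRAM capacities for common bus widths, assuming one chip
# of 1 GB or 2 GB density per 32-bit channel.
chip_densities_gb = [1, 2]

for bus_bits in (256, 320, 384):
    n_chips = bus_bits // 32
    capacities = [n_chips * d for d in chip_densities_gb]
    print(f"{bus_bits}-bit bus: {capacities} GB")
# 384-bit works out to [12, 24] GB; 16 GB only fits a 256-bit bus.
```

So on a 384-bit part the realistic options really are 12 GB or 24 GB, and 20 GB would imply a 320-bit bus.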
 
What percentage are you considering a big leap? As for me, I'm in the market for +20-25% over the 2080 Ti on the 3080 or above.

Realistically, hoping for 30%+ which I think isn’t toooo far fetched.
 
If they can give me a near-universal 25% boost and HDMI 2.1 support, I'll bite. Any less and I'll really have to think about this OR charge a lot more money for my 2080Ti.
 
Realistically, hoping for 30%+ which I think isn’t toooo far fetched.

same die size combined with 12 to 7nm shrink gives room for +50% shading and +300% Raytracing (assuming 20xx were 50/50 split). To cram the RTX on though 20xx dies were enormous; so I'm leaning towards something a bit smaller this time around and somewhat smaller gains. OTOH there should still be enough to give decent shader gains and enough ray tracing for it to be usable at the same resolutions/refresh rates that the shaders can do.
 
12GB vs 20~24 would keep the price from going to an absurd amount, but I can't see it being quite enough for a future PS5 or XBSX developed title being ported to PC.
 
same die size combined with 12 to 7nm shrink gives room for +50% shading and +300% Raytracing (assuming 20xx were 50/50 split). To cram the RTX on though 20xx dies were enormous; so I'm leaning towards something a bit smaller this time around and somewhat smaller gains. OTOH there should still be enough to give decent shader gains and enough ray tracing for it to be usable at the same resolutions/refresh rates that the shaders can do.
But what if they are only on Samsung's 8nm, which is a glorified 10nm?
 
Perhaps. But on the other hand, the top xx80tis have been stuck at 11 GB for 4+ years. I think adding another 1 GB is just too small an increase at this time and 16GB doesn't fit the bus. Something >20 GB on 320-384 bits sounds more likely.

Sure, if you don't mind paying $2000+ for it. This is NVidia after all. The 24GB Titan, which was basically the same card as the 2080Ti, was $3000.

Does anyone think NVidia is going to sell a high end 24GB card for under $2000?
 
Sure, if you don't mind paying $2000+ for it. This is NVidia after all. The 24GB Titan, which was basically the same card as the 2080Ti, was $3000.

Does anyone think NVidia is going to sell a high end 24GB card for under $2000?

I think even for NVIDIA, going over $2,000 for a 3080 Ti or 3090 (whatever it is called) is a bit excessive, especially with an AMD response a few months later. I am ready to go up to $1,500, but if they price it significantly above that I would feel like a big sucker for buying it (given the ~50% gain rumor over the 2080 Ti).
 
Sure, if you don't mind paying $2000+ for it. This is NVidia after all. The 24GB Titan, which was basically the same card as the 2080Ti, was $3000.

Does anyone think NVidia is going to sell a high end 24GB card for under $2000?

Hah, I absolutely do mind if this thing is over $1200, but this is Nvidia (but also, tech does get relatively cheaper over time). We've all been reading the many threads on this; if AMD can't bring out a competitive Big Navi/RDNA2 around the same time as these cards, I wouldn't put it past Mr. Buy More-Save More to say it's a steal at $1999.
 
I think even for NVIDIA, going over $2,000 for a 3080 Ti or 3090 (whatever it is called) is a bit excessive, especially with an AMD response a few months later. I am ready to go up to $1,500, but if they price it significantly above that I would feel like a big sucker for buying it (given the ~50% gain rumor over the 2080 Ti).

Not over. A steal at $1999 for a powerful 24GB GPU, when last year that cost you $2999. :D
 
Not over. A steal at $1999 for a powerful 24GB GPU, when last year that cost you $2999. :D

True, but if all I am doing is gaming and I have no professional use for all the extra memory, then it is not as good a value (for me), as I can just go down to the 3080 and pay a lot less for half the memory. Unless of course next-gen games' memory requirements skyrocket for gaming @4K? Not sure if there is a need for north of 20GB of memory for next-gen games - I would be curious if anyone more educated on this subject has any thoughts on potential memory requirements in next-gen games.
 
True, but if all I am doing is gaming and I have no professional use for all the extra memory, then it is not as good a value (for me), as I can just go down to the 3080 and pay a lot less for half the memory. Unless of course next-gen games' memory requirements skyrocket for gaming @4K? Not sure if there is a need for north of 20GB of memory for next-gen games - I would be curious if anyone more educated on this subject has any thoughts on potential memory requirements in next-gen games.

PS5 is 16GB shared, same with XBSX. According to Reddit, XBSX has up to 13.5GB available for GPU but often less. But even conservatively, doesn't sound like 12GB would be enough to be optimal, and 8GB might suck on a new gen port to PC.

Also, PC ports are not as optimized as console games, and settings on PC can be set much higher. That results in more VRAM usage in the same game on PC versus console. Worst case scenario, it is being speculated that new-gen ports may not be designed assuming the PC minimum is an SSD. If they port to PC and assume the lowest common denominator is a mechanical drive, VRAM usage could go much higher due to duplication of textures, in my understanding.

EDIT: I should add running out of VRAM or having a sub optimal amount doesn't always result in a huge performance drop though. This likely differs from one game to the next due to differences in game engine, code, dev designs/decisions, etc.
 
Last edited by a moderator:
PS5 is 16GB shared, same with XBSX. According to Reddit, XBSX has up to 13.5GB available for GPU but often less. But even conservatively, doesn't sound like 12GB would be enough to be optimal, and 8GB might suck on a new gen port to PC.

What I read was that XBSX has 13.5GB of RAM reserved for gaming, not for the GPU.
 
Fair enough. Probably confirms 384 bit bus. I still think 3090 is the gaming card and will have 12GB, 24GB will be for workstation cards with MUCH fatter margins.
It also confirms a 384bit bus for the 5700XT, which we know isn't true.
 
It also confirms a 384bit bus for the 5700XT, which we know isn't true.

No it doesn't. The chart says examples of GDDR6 applications are Titan RTX and 5700XT, and the example GDDR6 bus width is 384.

It doesn't say 5700XT has a 384-bit bus.

It doesn't confirm 12GB on the 3090 either for the same reason.
 
Seriously, that's the closest to launch that a new memory technology has ever been announced in the history of video cards. Even enhancements like GDDR5X were announced 5 months prior to their use in the GTX 1080.

I guess we will never see another Nvidia 512-bit graphics card? When they can't muster up any more memory compression miracles, they would rather pay to hack a solution built just for them :D

Any idea if this is an exclusive to Nvidia, or will AMD be including support on RDNA2?
 
Last edited:
https://www.tomshardware.com/amp/news/nvidia-geforce-rtx-3090-ga102-everything-we-know

no one knows for sure what it (the price) will be right now. Here's my argument for why I think it's going to set a new record for a GeForce card. I'd really love to be proven wrong, but I'm cynical about GPU prices these days after seeing the RTX 2080 Ti never really sell at $1,000 in any reasonable quantities, even two years after the launch.

First, the RTX 2080 Ti is a $1,200 part, and RTX 3090 is reintroducing the -90 model number. The last time we had such a part was the GTX 690 back in 2012, and it was priced at $1,000, exactly double the price of the GTX 680.
 
All I can say is careful what you wish for folks. As cool as 24gb sounds.... it will be useless in 99% of what you run as a gamer, until the 3000 series is outclassed by a couple generations of hardware. However Nvidia will be charging a very nice (for them) 24GB premium.
 
All I can say is careful what you wish for folks. As cool as 24gb sounds.... it will be useless in 99% of what you run as a gamer, until the 3000 series is outclassed by a couple generations of hardware. However Nvidia will be charging a very nice (for them) 24GB premium.

I can't imagine the 24GB card will be any less than $1600, and really expect $1800+.
 
I am wondering why Nvidia is pushing that big of a change when it comes to memory config, because they never really push memory that high; I was expecting a slight bump. Also, it looks like the 3090 is going to have 3 power connectors. So Nvidia probably pushed the cards hard, and might be putting a load of RAM on them to move sales and show it as an advantage this time. Both of these make me think we might not see a crazy lift - probably in line with 40% in regular raster performance, though probably a larger jump in ray tracing compared to the first gen. Or they are using more power now because they probably know Navi's ballpark numbers and are trying to stay a little ahead of AMD's cards with more memory and a little more performance.
 
24 GB might be overkill, but 12 GB on a flagship card in this day and age would be pathetic. Then again, without any competition at the high end, I don't really expect any better from Nvidia.

If it is 12 GB, I don't think this can be seen as anything other than another stopgap generation, which is sad considering that's all the last generation was, other than having RT capabilities slapped on that it couldn't really use (along with another few hundred dollars on the price tag). The less said about AMD right now the better, though I do hope they manage to pull their head out of their ass like the CPU division finally managed to, because we desperately need the competition.

</Grumpy gamer mode>
 
24 GB might be overkill, but 12 GB on a flagship card in this day and age would be pathetic. Then again, without any competition at the high end, I don't really expect any better from Nvidia.

If it is 12 GB, I don't think this can be seen as anything other than another stopgap generation, which is sad considering that's all the last generation was, other than having RT capabilities slapped on that it couldn't really use (along with another few hundred dollars on the price tag). The less said about AMD right now the better, though I do hope they manage to pull their head out of their ass like the CPU division finally managed to, because we desperately need the competition.

</Grumpy gamer mode>


But the cost of implementing that new signaling standard is expensive, so the 12 GB limit makes sense.

Much like the added cost of GDDR6 was so much higher that the PS5 had to settle for a mere doubling of capacity after eight years, the added cost of going to analog-style (PAM4) signaling on GDDR6X is going to demand we stay fixed on capacity for this generation.

I was hoping that NVIDIA had a bit more compression magic to pull out of its hat, but the massive bandwidth increase of A100 says otherwise - we will need to wait for a second generation with mass production before we will see a doubling of capacity.

If you need to develop a new memory standard every time you develop a new architecture, you have to redirect SOME memory density costs into that standard.

It doesn't mean we're standing still, just that a good expectation for a doubling of VRAM capacity is now 6-8 years. So maybe Ampere's replacement?
 
Last edited:
The only individuals who need this card are dudes like me in 3D rendering. I can literally eat through 64 GB of CPU RAM alone, and sadly only the $6k Quadros can meet my demand for GPU rendering. So anything under $2500 is a steal for us CGI boys and other professional users. You gamer boys are just wasting money with that extra real estate, trust me.
 
Last edited: