Did Nvidia Cancel the RTX 3080 20GB and RTX 3070 16GB?

If the rumors of Nvidia TSMC 7nm cards in 2021 are true, then it makes even less sense to release a Samsung 20GB variant...
 
Huh, this is pretty interesting news. Really does make me wonder if it is a GDDR6X supply issue. I mean, do the AIBs need Nvidia's permission to release a 20GB version of the 3080?....

Anyway, very interesting development.
 
Huh, this is pretty interesting news. Really does make me wonder if it is a GDDR6X supply issue. I mean, do the AIBs need Nvidia's permission to release a 20GB version of the 3080?....

Anyway, very interesting development.
3080 is 6X, 3070 is 6. Given other information I have, I do not think this points to VRAM or power component supply issues at all.
 
3080 is 6X, 3070 is 6. Given other information I have, I do not think this points to VRAM or power component supply issues at all.
Well, the only other thing it could possibly be is that Samsung is having 8nm yield issues, then?..... Crazy times.
 
You would be surprised at how big of an impact that VRAM amount has on sales. There is a lot of sales data to support this that I have been privy to over the years.
The 6GB 1060 vs the 480s, and the 6GB 2060 vs the 8GB 5700, indicate to me that when Nvidia feels they have the performance lead, they will cut cost to maximize profits. Shipping Jensen's flagship (lol) 3080 with 10GB from the start shows that, at least in Nvidia's case, they aren't too worried about the perception of the RAM amount; the new GDDR6X memory type, though, was indeed broadcast as being much better. Looks like AMD gave up the more-RAM battle with the 5500 and 5600, matching the amounts on competing Nvidia cards. I am sure you are 100% correct that people will flock to higher-RAM cards unless effectively steered away from them by other marketing methods.
 
Which makes no sense: 4GB more for less performance, and every reviewer out there would indicate 20GB has no benefit other than a higher cost... A losing position, as I see it. Nvidia can always put out a 3080 Ti with 12GB and performance up near 3090 levels, depending on how cut down it is; that makes way more sense. 10%-20% over a 3080 at very, very little additional cost over a 3080; that, I think, is Nvidia's play. When has Nvidia ever been worried about VRAM amount versus the competition?

The 20GB on the box makes the casual consumer feel warm inside.

the-guarantee-is-on-the-box.jpg
 
I heard this 20GB/16GB cancellation rumor about 36 hours ago. I could not find anything concrete to back it up, although it did come from a very solid source. I was expecting to get proof last night, but it did not come.

So moving forward assuming this is true, I think the only conclusion is that Samsung is having some very big problems with 8nm yield. Just my 2 cents, you may need change.

Those Samsung yield rumors have been around for a long time and the process is over 3 years old. If yields are still a problem on such an old process Nvidia really fucked up this time.
 
Those Samsung yield rumors have been around for a long time and the process is over 3 years old. If yields are still a problem on such an old process Nvidia really fucked up this time.
I am not repeating a rumor. I am telling you that after analyzing the market and the orders placed for parts etc., it very much points to ASIC issues. This is not something I am passing along that I read on some anon forum, but rather my educated opinion after looking at the big picture in the market with a lot of other solid data points.
 
I heard this 20GB/16GB cancellation rumor about 36 hours ago. I could not find anything concrete to back it up, although it did come from a very solid source. I was expecting to get proof last night, but it did not come.

So moving forward assuming this is true, I think the only conclusion is that Samsung is having some very big problems with 8nm yield. Just my 2 cents, you may need change.
Did Nvidia get any TSMC 7nm capacity for 2021, or was the plan 8nm for the entire Ampere lineup? I’ve read rumblings of Nvidia moving to 7nm next year, so maybe this makes sense... I just thought Nvidia didn’t get much TSMC capacity.
 
20GB seemed excessive to me. I have my eye on a 3080, and wouldn't mind more than 10GB, but I'd rather pay a little extra for a 12GB variant than a lot extra for a 20GB variant. I say that not knowing if 12GB is plausible given the memory configuration.
 
Did Nvidia get any TSMC 7nm capacity for 2021, or was the plan 8nm for the entire Ampere lineup? I’ve read rumblings of Nvidia moving to 7nm next year, so maybe this makes sense... I just thought Nvidia didn’t get much TSMC capacity.
NVIDIA is already producing its Quadro cards at TSMC on 7nm. Besides unfounded rumors, there's nothing else on moving Ampere production to them. We have the NV earnings call coming up very soon. Should be interesting, as Jensen will surely throw Samsung under the bus if the blame rests with them.
 
Those Samsung yield rumors have been around for a long time and the process is over 3 years old. If yields are still a problem on such an old process Nvidia really fucked up this time.

How old is Intel's 10nm process at this point?

Rocket lake is still going to be 14nm :D
 
If true Nvidia might have just taken themselves out of contention for me assuming AMD doesn't have another flop coming. 10GB would be fine on a stopgap card but with how long I keep cards there's no way I'd settle for 10GB right now, especially on a card that powerful and at that price point.
 
You would be surprised at how big of an impact that VRAM amount has on sales. There is a lot of sales data to support this that I have been privy to over the years.

As a salesperson of PCs (especially custom gaming PCs) I can back this up.

A person will say "I want this to be real fast, I'll need an 8 gig card!" And I have to explain that the RAM amount has little to do with the performance. Showing the user an RX580 8GB and a RTX 2080 Super 8GB helps them to understand the mootness of VRAM capacity.
 
20GB seemed excessive to me. I have my eye on a 3080, and wouldn't mind more than 10GB, but I'd rather pay a little extra for a 12GB variant than a lot extra for a 20GB variant. I say that not knowing if 12GB is plausible given the memory configuration.

12 GB is plausible if they unlocked the whole memory bus like they did on the 3090
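
For what it's worth, the plausible capacities fall straight out of the bus width: GDDR6X puts one memory chip on each 32-bit channel, and a clamshell layout (chips on both sides of the PCB, like the 3090) doubles the chip count. A rough sketch of that math — the helper function is just illustrative, and the 1 GB / 2 GB densities are the GDDR6X modules Micron actually ships:

```python
# Back-of-the-envelope: plausible VRAM capacities from bus width.
# One chip per 32-bit channel; clamshell mode doubles the chip count.

def vram_options(bus_width_bits):
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    options = set()
    for density_gb in (1, 2):             # per-chip capacity in GB
        options.add(chips * density_gb)       # normal layout
        options.add(chips * 2 * density_gb)   # clamshell layout
    return sorted(options)

print(vram_options(320))  # 3080-style 320-bit bus -> [10, 20, 40]
print(vram_options(384))  # full GA102 384-bit bus -> [12, 24, 48]
```

So 12GB needs the full 384-bit bus with 1 GB chips, 20GB is the 320-bit bus in clamshell with 1 GB chips (or 2 GB chips single-sided), and the 3090's 24GB is the full bus in clamshell.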

Also, For non-gaming tasks, the more memory the better. The answer to "how much memory do I need" for something like Blender is "as much as you can afford".

I run 128GB of RAM and I regularly have scenes taking up 80+GB when rendering. These scenes could not be rendered on anything other than a Quadro setup running NVLink to pool memory into one bank over 80GB.

To my shock, if you can't fit the whole scene in RAM, the render just does not work. One of the issues with path tracing is that any one part of the scene can ray-intersect with any other part of the scene, even millions of units away, behind corners, etc., so the entire scene needs to be decompressed and fully loaded into RAM.
 
You would be surprised at how big of an impact that VRAM amount has on sales. There is a lot of sales data to support this that I have been privy to over the years.

Yep. Amount of VRAM is what normal, non [H] people judge graphics cards by.

It's usually in big text on the box, or in the product listing name on amazon, etc. And it's the only spec they recognize and can compare to other graphics cards.

This is what comes up when you search "graphics cards" on amazon.com
[attached screenshot: Amazon search results for "graphics cards"]
 
I am not repeating a rumor. I am telling you after analyzing the market and the orders placed for parts etc., it very much points to ASIC issues. This is not something I am passing along that I read on some anon forum, but rather giving you my educated opinion after looking the big picture in the market with a lot of other solid data points.
Yes. While Samsung 8nm should be “mature” by now, it was never meant to be used on such a large chip with such a high power draw. We have to ask ourselves why some can undervolt substantially and get pretty much the same performance. I think some chips just need a ton more voltage than others, and Jensen wanted to win the performance battle, so they had to overvolt to cover for yield issues. That out-of-the-box power draw is frankly ridiculous given how little overclock headroom you get from adding 100W more. Most of these chips are right at the edge already.
 
So, a new rumor about an older rumor, neither of which is confirmed, both of which serve only to create FOMO, create more hesitation to buy Ampere, and potentially steer some to wait for RDNA2 (16GB cards)...

Hmmnnnnn, cui bono??
AMD or nVIDIA? Oh, and it's from the "National Enquirer" of PC 'journalism', Videocardz, going back further to other bigger, more well-known tabloids on YouTube, i.e. the usual suspects.
 
So, a new rumor about an older rumor, neither of which is confirmed, both of which serve only to create FOMO, create more hesitation to buy Ampere, and potentially steer some to wait for RDNA2 (16GB cards)...

Hmmnnnnn, cui bono??
AMD or nVIDIA? Oh, and it's from the "National Enquirer" of PC 'journalism', Videocardz, going back further to other bigger, more well-known tabloids on YouTube, i.e. the usual suspects.
So is the rumor true or not? What is your take, exactly, besides that a rumor site telling rumors is bad?
 
You really have to wonder how Nvidia got themselves into this position... although I guess one can look at a company the size of Intel :D

I guess AMD took a chance with TSMC and the relationship has paid off in massive dividends.
 
So is the rumor true or not? What is your take, exactly, besides that a rumor site telling rumors is bad?


I don't believe it. Why launch cards with huge jumps in memory amount so very soon after the initial launch of Ampere? Not to mention the other, bigger elephant in the room: the average game designer plans and builds projects around 4GB to 6GB to 8GB cards. A few Radeon VII sales will not send game designers running to make games that can use a 16GB frame buffer (until an extremely large built-in customer base exists), nor will RDNA2 if the VRAM size rumors are true. They plan based on averages and/or the lowest common denominator, and the 800-pound gorilla averages 6GB, not 16. MS FS 2020 is an obvious huge outlier.

AMD sold very few 5700/XTs compared to RTX 2000-series cards. Hopefully RDNA2 is much better and has solid drivers, as the ones I used at launch on my Gigabyte 5700 non-XT were garbage. I simply wish to play games, not troubleshoot for multi-billion-dollar multinational corporations.

I chalk most of the slowdown in cards getting into folks' hands up to Samsung's location and South Korea having insane lockdown measures. Add in international border crossings being slowed to a literal crawl over the last many months, and international shipping actually getting slower, not better, in the near future (COVI-PASS).
 
I chalk most of the slowdown in cards getting into folks' hands up to Samsung's location and South Korea having insane lockdown measures. Add in international border crossings being slowed to a literal crawl over the last many months, and international shipping actually getting slower, not better, in the near future (COVI-PASS).

If that were the case, Samsung TVs would be hard to buy. Memory modules would be hard to buy. Heck, all of the Samsung M.2 drives would be super rare.

Heck, Kias and Hyundais would be very rare to buy!...... Basically, there are no issues with buying any product from a Korean company right now.
 
I’m not surprised. I thought having 20GB variants of the 3080 by the end of the year was too soon. I think that the 20GB variant will come sometime late spring 2021 as a Super refresh. I don’t believe they’re entirely cancelled.

If this is true this might debunk MLID’s conspiracy theory that Nvidia has been hoarding GDDR6X modules for the 20GB variants. I took his conspiracy theory with a massive grain of salt from the beginning since it didn’t make sense to me from a business perspective to artificially cause supply shortages when you have a 1-2 month head start against your competition. Would somewhat make sense if AMD’s offerings aren’t competitive which doesn’t appear to be the case.

In reality I’m much more inclined to believe Nvidia launched the 3080/3090 earlier than planned because AMD is going to have a really competitive product. But due to yield issues at Samsung and probably COVID restrictions in some form, they’re not able to produce sufficient numbers of chips. So they didn’t have large stock to begin with at launch and they’re continuing to have problems.

Would also explain the early CTD issues as the cards definitely seem rushed to the point that Nvidia and AIB partners haven’t been able to determine an appropriate power curve for these cards. Several cards can be undervolted which yields slightly better overall performance and sometimes significantly reduced power consumption and thermals.

Guess we’ll have to wait and see how the rest of this year goes. We’ll see if MLID is credible or if he’s a conspiracy theorist that tries to piece things together for clicks.
 
I’m not surprised. I thought having 20GB variants of the 3080 by the end of the year was too soon. I think that the 20GB variant will come sometime late spring 2021 as a Super refresh. I don’t believe they’re entirely cancelled.

If this is true this might debunk MLID’s conspiracy theory that Nvidia has been hoarding GDDR6X modules for the 20GB variants. I took his conspiracy theory with a massive grain of salt from the beginning since it didn’t make sense to me from a business perspective to artificially cause supply shortages when you have a 1-2 month head start against your competition. Would somewhat make sense if AMD’s offerings aren’t competitive which doesn’t appear to be the case.

In reality I’m much more inclined to believe Nvidia launched the 3080/3090 earlier than planned because AMD is going to have a really competitive product. But due to yield issues at Samsung and probably COVID restrictions in some form, they’re not able to produce sufficient numbers of chips. So they didn’t have large stock to begin with at launch and they’re continuing to have problems.

Would also explain the early CTD issues as the cards definitely seem rushed to the point that Nvidia and AIB partners haven’t been able to determine an appropriate power curve for these cards. Several cards can be undervolted which yields slightly better overall performance and sometimes significantly reduced power consumption and thermals.

Guess we’ll have to wait and see how the rest of this year goes. We’ll see if MLID is credible or if he’s a conspiracy theorist that tries to piece things together for clicks.

Well, the MLID theory could still hold true. Right now the rumors are that it's a Samsung 8nm issue and not a memory issue. So for all we know, Nvidia is hoarding all the memory modules; the problem would be that they don't have enough GPUs from Samsung to put the memory on.
 
12 GB is plausible if they unlocked the whole memory bus like they did on the 3090

Also, For non-gaming tasks, the more memory the better. The answer to "how much memory do I need" for something like Blender is "as much as you can afford".

I run 128GB of RAM and I regularly have scenes taking up 80+GB when rendering. These scenes could not be rendered on anything other than a Quadro setup running NVLink to pool memory into one bank over 80GB.

To my shock, if you can't fit the whole scene in RAM, the render just does not work. One of the issues with path tracing is that any one part of the scene can ray-intersect with any other part of the scene, even millions of units away, behind corners, etc., so the entire scene needs to be decompressed and fully loaded into RAM.
No doubt there are use cases where 20GB of VRAM would be great. Those use cases just don't apply to me, and I suspect the majority of people complaining that 10GB isn't enough.
 
Kind of hard to launch unofficial 20GB cards when you can't even fill orders for the 10GB ones. Not to mention what a dick move it would be to release 20GB models when people are still clamoring to get 10GB models!

Don't matter to me I guess.... I just want a 3090 within the next 6 months damnit! :p
 
Well the MLID theory could still hold true.
Given a lot of the delivery estimates for retailers pushing out to end of November and even into January, it's seeming less likely to be true. But who knows.
 
No doubt there are use cases where 20GB of VRAM would be great. Those use cases just don't apply to me, and I suspect the majority of people complaining that 10GB isn't enough.
Are you talking about right now or in two years? Because I agree with the former but strongly disagree with the latter. I've had the option to get double the VRAM several times and have done so, even though most seemed to think it was useless; at the time it was, but by the time I got rid of the cards it was very useful. I realize not everyone keeps cards as long as I tend to these days, and there have been some clunker cards with extra memory that was too slow (i.e. DDR vs GDDR), but every one I've bought has absolutely been worth it in the end, and if I had gone for the lower-capacity card I would have needed to replace it much sooner.

The new consoles have 16GB GDDR6 which is shared but I could see graphically intensive games getting 12GB allocated to the GPU which means the 3080 wouldn't even be able to offer the console experience on those titles. Obviously that would be an edge case scenario but even if games allocate it fairly evenly that's still setting the bar at 8GB which is a little too close to 10GB for me considering the performance level that's considered acceptable on consoles.
 
Are you talking about right now or in two years? Because I agree with the former but strongly disagree with the latter. I've had the option to get double the VRAM several times and have done so, even though most seemed to think it was useless; at the time it was, but by the time I got rid of the cards it was very useful. I realize not everyone keeps cards as long as I tend to these days, and there have been some clunker cards with extra memory that was too slow (i.e. DDR vs GDDR), but every one I've bought has absolutely been worth it in the end, and if I had gone for the lower-capacity card I would have needed to replace it much sooner.
Right now. Yeah, in two years it could maybe be an issue, although I expect it could be something like limited to high textures at 4K instead of ultra in certain games.

Given the cost of GDDR6X, I think it's really likely a 20GB version of the 3080 would be priced $300 higher. I don't think the premium would be worth it to me - I'd rather save the $300 and upgrade to a new card in 3-4 years instead of 5-6. I'm sure there's a case to be made that you'd get some of that back in extra resale value, but I'll take the extra savings up front.
 
Right now. Yeah, in two years it could maybe be an issue, although I expect it could be something like limited to high textures at 4K instead of ultra in certain games.

Given the cost of GDDR6X, I think it's really likely a 20GB version of the 3080 would be priced $300 higher. I don't think the premium would be worth it to me - I'd rather save the $300 and upgrade to a new card in 3-4 years instead of 5-6. I'm sure there's a case to be made that you'd get some of that back in extra resale value, but I'll take the extra savings up front.
This. I’m ok with 10GB on the 3080 as I’ll be upgrading the GPU once the next generation comes out. During this time I anticipate very few games actually surpassing 10GB VRAM. Especially at the resolution I play; 3440x1440 soon to likely be 3840x1600. But if the price difference is small (less than $200) then I’d pony up for the 20GB.
 
Right now. Yeah, in two years it could maybe be an issue, although I expect it could be something like limited to high textures at 4K instead of ultra in certain games.

Given the cost of GDDR6X, I think it's really likely a 20GB version of the 3080 would be priced $300 higher. I don't think the premium would be worth it to me - I'd rather save the $300 and upgrade to a new card in 3-4 years instead of 5-6. I'm sure there's a case to be made that you'd get some of that back in extra resale value, but I'll take the extra savings up front.
I'm far from an expert on the subject, but the most convincing numbers I've seen suggest it would cost AIBs an extra $100. I believe that was spot price, not large-scale contract price, and with how new GDDR6X is, the price will drop quickly. That $100 might easily become an extra $200 at retail, but I doubt it would add $300. I think $100-$200 extra on an $800 card is more than reasonable if it means the card is more viable in 2 years. I would be much less concerned if I were planning to replace it as soon as something better comes out.

I also think you're underestimating what the new console generation will do to bump up VRAM requirements. The current consoles have half the memory of the next ones, and we already have a game or two that perform better with more than 10GB. The new consoles will be running higher resolutions, which will use up some of the extra memory, but I also think we'll see more of the shared memory used by the GPU this generation. My guess is that in two years VRAM requirements will basically double, and I certainly wouldn't want a 5GB card right now, especially at the performance level of, say, the 2080.
 
I'm far from an expert on the subject, but the most convincing numbers I've seen suggest it would cost AIBs an extra $100. I believe that was spot price, not large-scale contract price, and with how new GDDR6X is, the price will drop quickly. That $100 might easily become an extra $200 at retail, but I doubt it would add $300. I think $100-$200 extra on an $800 card is more than reasonable if it means the card is more viable in 2 years. I would be much less concerned if I were planning to replace it as soon as something better comes out.

I also think you're underestimating what the new console generation will do to bump up VRAM requirements. The current consoles have half the memory of the next ones, and we already have a game or two that perform better with more than 10GB. The new consoles will be running higher resolutions, which will use up some of the extra memory, but I also think we'll see more of the shared memory used by the GPU this generation. My guess is that in two years VRAM requirements will basically double, and I certainly wouldn't want a 5GB card right now, especially at the performance level of, say, the 2080.
I'm also no expert, but I believe GDDR6 costs Nvidia about $12/GB. We don't know exactly what GDDR6X costs, just that it costs more. Guessing about $15/GB is how I arrived at $300; I figured double the cost at retail. I certainly could be wrong, and if it's only a $100-200 premium, that would make it much more compelling.
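
To lay the arithmetic out, here's the estimate as a tiny sketch. Everything in it is a forum guess, not confirmed pricing: the $/GB figures, the 10GB of extra memory a 20GB variant would need, and the assumption that BOM cost roughly doubles by the time it reaches retail.

```python
# Rough cost sketch for a hypothetical 20GB 3080.
# All inputs are guesses from the thread, not real Nvidia pricing.

def retail_premium(extra_gb, cost_per_gb, retail_multiplier=2.0):
    """Estimated retail price bump from adding extra_gb of memory,
    assuming BOM cost is marked up by retail_multiplier at the shelf."""
    return extra_gb * cost_per_gb * retail_multiplier

# 10GB extra over the stock card, at the two $/GB guesses above:
for cost in (12, 15):
    print(f"${cost}/GB -> ~${retail_premium(10, cost):.0f} premium at retail")
```

At $15/GB the sketch lands on the $300 figure; at the GDDR6 -like $12/GB it comes out closer to $240, which is roughly where the "$100-$200 at retail" camp ends up if the markup is less than 2x.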
 
I'm thinking NV did cancel their 16 and 20GB SKUs if they were ever planned in the first place. They quit selling the Founders Edition cards themselves and put Best Buy in charge of selling them, so dropping any further planned versions makes sense. As far as their partners making them go, why would they? Can they? At that point NV is just supplying the chips plus technical info and support. If EVGA wanted to make a 20GB 3080 would NV say "NO! You may not try to keep up with Big Navi and its 16GB!"? In other words I think we'll see 20GB 3080s and 16GB 3070s eventually, just not on a Founder's Edition card and not until the board manufacturers think it's in their best interest to do so.
 
I'm thinking NV did cancel their 16 and 20GB SKUs if they were ever planned in the first place. They quit selling the Founders Edition cards themselves and put Best Buy in charge of selling them, so dropping any further planned versions makes sense. As far as their partners making them go, why would they? Can they? At that point NV is just supplying the chips plus technical info and support. If EVGA wanted to make a 20GB 3080 would NV say "NO! You may not try to keep up with Big Navi and its 16GB!"? In other words I think we'll see 20GB 3080s and 16GB 3070s eventually, just not on a Founder's Edition card and not until the board manufacturers think it's in their best interest to do so.
NV partners do nothing like that without full NV approval. Especially EVGA.
 
NV partners do nothing like that without full NV approval. Especially EVGA.
This makes total sense when you consider that the only $999 2080 Ti option that wasn't a rebate or sale was the EVGA Black card. I love the customer support over there and would love for them to branch out with AMD products. I would put them at the top of my list if they ever did that.
 
This makes total sense when you consider that the only $999 2080 Ti option that wasn't a rebate or sale was the EVGA Black card. I love the customer support over there and would love for them to branch out with AMD products. I would put them at the top of my list if they ever did that.
That will never happen. Nvidia carried/carries a large amount of EVGA's debt. I expect they still do. Nvidia owns EVGA in more ways than one. This has been common knowledge behind the scenes for years. That said, I have never seen absolute proof of this, but very much trust the sources that have given that information. My two cents, you may need change.
 
This makes total sense when you consider that the only $999 2080 Ti option that wasn't a rebate or sale was the EVGA Black card. I love the customer support over there and would love for them to branch out with AMD products. I would put them at the top of my list if they ever did that.
Insurance for the $999 MSRP that Nvidia laid out for the 2080 Ti is my guess. They don't even make or carry AMD motherboards; that would just be throwing money away, though maybe they will in the future. If they are for the pro enthusiasts and hardcore gamers, they should be for Zen 3.
 
Insurance for the $999 MSRP that Nvidia laid out for the 2080 Ti is my guess. They don't even make or carry AMD motherboards; that would just be throwing money away, though maybe they will in the future. If they are for the pro enthusiasts and hardcore gamers, they should be for Zen 3.
I read it the same way. They were pulled into an office and told they had to make it, no matter how it performed or looked. I don't think we will live to see EVGA build anything for AMD products. I like my EVGA motherboards and power supplies; I think this is my third or fourth EVGA video card in the wife's machine.
 