> When has Nvidia ever been worried about vram amount over the competition?

You would be surprised at how big of an impact that VRAM amount has on sales. There is a lot of sales data to support this that I have been privy to over the years.
> If the rumors of Nvidia TSMC 7nm cards in 2021 are true, then it makes even less sense to release a Samsung 20GB variant...

They were probably cancelled and will be re-released as Super variants on TSMC.
> Huh, this is pretty interesting news. Really does make me wonder if it is a GDDR6X supply issue. I mean, do the AIBs need Nvidia's permission to release a 20GB version of the 3080?

The 3080 uses GDDR6X, the 3070 plain GDDR6. Given other information I have, I do not think this points to VRAM or power component supply issues at all.
Anyway, very interesting development.
> The 3080 uses GDDR6X, the 3070 plain GDDR6. Given other information I have, I do not think this points to VRAM or power component supply issues at all.

Well, the only other thing it could possibly be is that Samsung is having 8nm yield issues then? Crazy times.
> You would be surprised at how big of an impact that VRAM amount has on sales. There is a lot of sales data to support this that I have been privy to over the years.

The 6GB 1060 vs. the 480s, and the 6GB 2060 vs. the 8GB 5700, just indicate to me that Nvidia, if they feel they have the performance lead, will cut cost to maximize profits. Putting 10GB on Jensen's flagship (lol) 3080 from the start does not show, at least in Nvidia's case, much worry about perception of the RAM amount; the new GDDR6X type of RAM, on the other hand, was indeed broadcast for how much better it would be. Looks like AMD gave up the more-RAM battle with the 5500 and 5600, which carry the same amount as the competing Nvidia cards. I am sure you are 100% correct that people will flock to higher-RAM cards unless effectively detoured away from them by other marketing methods.
Which makes no sense - 4GB more for less performance, and every reviewer out there would indicate 20GB has no benefit other than a higher cost... A losing position, as I see it. Nvidia can always put out a 3080 Ti with 12GB and performance up to 3090 level, depending on whether it is cut down or not; that makes way more sense. 10%-20% over a 3080 for very, very little additional cost over a 3080 - that, I think, is Nvidia's play. When has Nvidia ever been worried about VRAM amount over the competition?
I heard this 20GB/16GB cancellation rumor about 36 hours ago. I could not find anything concrete to back it up, although it did come from a very solid source. I was expecting to get proof last night, but it did not come.
So moving forward assuming this is true, I think the only conclusion is that Samsung is having some very big problems with 8nm yield. Just my 2 cents, you may need change.
> Those Samsung yield rumors have been around for a long time and the process is over 3 years old. If yields are still a problem on such an old process, Nvidia really fucked up this time.

I am not repeating a rumor. I am telling you that, after analyzing the market and the orders placed for parts etc., it very much points to ASIC issues. This is not something I am passing along that I read on some anonymous forum, but rather my educated opinion after looking at the big picture in the market along with a lot of other solid data points.
> I heard this 20GB/16GB cancellation rumor about 36 hours ago. I could not find anything concrete to back it up, although it did come from a very solid source. I was expecting to get proof last night, but it did not come.
>
> So moving forward, assuming this is true, I think the only conclusion is that Samsung is having some very big problems with 8nm yield. Just my 2 cents, you may need change.

Did Nvidia get any TSMC 7nm capacity for 2021, or was 8nm the plan for the entire Ampere lineup? I've read rumblings of Nvidia moving to 7nm next year, so maybe this makes sense... I just thought Nvidia didn't get much TSMC capacity.
> Did Nvidia get any TSMC 7nm capacity for 2021, or was 8nm the plan for the entire Ampere lineup? I've read rumblings of Nvidia moving to 7nm next year, so maybe this makes sense... I just thought Nvidia didn't get much TSMC capacity.

NVIDIA is already producing its Quadro cards at TSMC on 7nm. Besides unfounded rumors, there is nothing on moving Ampere production to them. We have an NV earnings call coming up very soon. Should be interesting, as Jensen will surely throw Samsung under the bus if this rests on them.
Those Samsung yield rumors have been around for a long time and the process is over 3 years old. If yields are still a problem on such an old process Nvidia really fucked up this time.
You would be surprised at how big of an impact that VRAM amount has on sales. There is a lot of sales data to support this that I have been privy to over the years.
> You would be surprised at how big of an impact that VRAM amount has on sales. There is a lot of sales data to support this that I have been privy to over the years.

20GB seemed excessive to me. I have my eye on a 3080 and wouldn't mind more than 10GB, but I'd rather pay a little extra for a 12GB variant than a lot extra for a 20GB variant. I say that not knowing whether 12GB is plausible given the memory configuration.
> I am not repeating a rumor. I am telling you that, after analyzing the market and the orders placed for parts etc., it very much points to ASIC issues. This is not something I am passing along that I read on some anonymous forum, but rather my educated opinion after looking at the big picture in the market along with a lot of other solid data points.

Yes. While Samsung 8nm should be "mature" by now, it was never meant to be used on such a large chip with such a high power draw. We have to ask ourselves why some cards can be undervolted substantially and deliver pretty much the same performance. I think some chips just need a ton more voltage than others, and Jensen wanted to win the performance battle, so they had to overvolt to cover for yield issues. That out-of-the-box power draw is frankly ridiculous given how little overclock headroom you get from adding 100W more. Most of these chips are right at the edge already.
> So, a new rumor about an older rumor, neither of which is confirmed, both of which serve only to create FOMO, add hesitation to buy Ampere, and potentially steer some to wait for RDNA2 (16GB cards)...
>
> Hmmnnnnn, cui bono??
>
> AMD or NVIDIA? Oh, and it's from the "National Enquirer" of PC 'journalism', Videocardz, going back further to other bigger, better-known tabloids on YouTube, i.e. the usual suspects.

So is the rumor true or not? What is your take, exactly, besides "rumor sites telling rumors is bad"?
I chalk up most of the slowdown in cards getting into folks' hands to Samsung's location, with South Korea having insane lockdown measures; add in international border crossings being slowed to a literal crawl over the last many months, and international shipping actually getting slower, not faster, in the near future (COVI-PASS).
I’m not surprised. I thought having 20GB variants of the 3080 by the end of the year was too soon. I think that the 20GB variant will come sometime late spring 2021 as a Super refresh. I don’t believe they’re entirely cancelled.
If this is true this might debunk MLID’s conspiracy theory that Nvidia has been hoarding GDDR6X modules for the 20GB variants. I took his conspiracy theory with a massive grain of salt from the beginning since it didn’t make sense to me from a business perspective to artificially cause supply shortages when you have a 1-2 month head start against your competition. Would somewhat make sense if AMD’s offerings aren’t competitive which doesn’t appear to be the case.
In reality I’m much more inclined to believe Nvidia launched the 3080/3090 earlier than planned because AMD is going to have a really competitive product. But due to yield issues at Samsung and probably COVID restrictions in some form, they’re not able to produce sufficient numbers of chips. So they didn’t have large stock to begin with at launch and they’re continuing to have problems.
Would also explain the early CTD issues as the cards definitely seem rushed to the point that Nvidia and AIB partners haven’t been able to determine an appropriate power curve for these cards. Several cards can be undervolted which yields slightly better overall performance and sometimes significantly reduced power consumption and thermals.
Guess we’ll have to wait and see how the rest of this year goes. We’ll see if MLID is credible or if he’s a conspiracy theorist that tries to piece things together for clicks.
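The undervolting point a few posts up can be illustrated with some toy numbers (a hedged sketch: the voltage, clock, power, and FPS figures below are entirely invented for illustration, not measurements from any real card):

```python
# Toy illustration of why undervolting can improve efficiency:
# dynamic power scales roughly with V^2 * f, while game performance
# scales sub-linearly with clock, so perf-per-watt rises as voltage drops.
samples = [
    # (voltage_V, clock_MHz, board_power_W, relative_fps) - invented numbers
    (1.081, 1950, 320, 100.0),  # "stock" operating point
    (0.900, 1850, 250, 97.0),   # "undervolted" operating point
]

for volts, clock, watts, fps in samples:
    print(f"{volts:.3f} V @ {clock} MHz: {fps / watts:.3f} fps per watt")
```

With these made-up points, the undervolted profile gives up about 3% of the performance for roughly 22% less power, which is the kind of trade people in the thread are describing.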
> 12GB is plausible if they unlocked the whole memory bus like they did on the 3090.
>
> Also, for non-gaming tasks, the more memory the better. The answer to "how much memory do I need" for something like Blender is "as much as you can afford".
>
> I run 128GB of RAM and I regularly have scenes taking up 80+GB when rendering. These scenes could not be rendered on anything other than a Quadro setup running NVLink to pool memory into one bank of over 80GB.
>
> To my shock, if you can't fit the whole scene in RAM, the render just does not work. One of the issues with path tracing is that any one part of the scene can ray-intersect with any other part of the scene, even millions of units away, behind corners, etc., so the entire scene needs to be decompressed and fully loaded into RAM.

No doubt there are use cases where 20GB of VRAM would be great. Those use cases just don't apply to me, and I suspect not to the majority of people complaining that 10GB isn't enough.
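The point above about 12GB being plausible if the full memory bus is unlocked comes down to simple arithmetic (a rough sketch: the bus widths are the publicly known 3080/3090 figures, and it assumes launch-era 8Gb, i.e. 1GB, GDDR6X chips on 32-bit channels):

```python
# Rough sketch: possible GDDR6X capacities from memory bus width.
# Each memory chip sits on a 32-bit channel; launch-era GDDR6X chips
# held 1 GB each, and "clamshell" mode puts two chips per channel.
CHIP_GB = 1  # 8 Gb per chip

def capacity_options(bus_width_bits):
    """Possible VRAM sizes (GB) for a given bus width."""
    channels = bus_width_bits // 32
    return [channels * CHIP_GB,       # one chip per channel
            channels * CHIP_GB * 2]   # clamshell: two chips per channel

print(capacity_options(320))  # 3080's 320-bit bus      -> [10, 20]
print(capacity_options(384))  # full GA102 384-bit bus  -> [12, 24]
```

So a 12GB 3080 would require enabling the full 384-bit bus (as on the 3090, which uses the 24GB clamshell option), while the rumored 20GB card is the clamshell version of the existing 320-bit configuration.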
> Well, the MLID theory could still hold true.

Given a lot of the retailer delivery estimates pushing out to the end of November and even into January, it's seeming less likely to be true. But who knows.
> No doubt there are use cases where 20GB of VRAM would be great. Those use cases just don't apply to me, and I suspect not to the majority of people complaining that 10GB isn't enough.

Are you talking about right now, or in two years? Because I agree with the former but strongly disagree with the latter. I've had the option to get double the VRAM several times and have done so even though most seemed to think it was useless - and at the time it was - but by the time I got rid of the cards it was very useful. I realize not everyone keeps cards as long as I tend to these days, and there have been some clunker cards with extra memory that was too slow (i.e. DDR vs. GDDR), but every one I've bought has absolutely been worth it in the end, and if I had gone for the lower-capacity card I would have needed to replace it much sooner.
> Are you talking about right now, or in two years? Because I agree with the former but strongly disagree with the latter. I've had the option to get double the VRAM several times and have done so even though most seemed to think it was useless - and at the time it was - but by the time I got rid of the cards it was very useful. I realize not everyone keeps cards as long as I tend to these days, and there have been some clunker cards with extra memory that was too slow (i.e. DDR vs. GDDR), but every one I've bought has absolutely been worth it in the end, and if I had gone for the lower-capacity card I would have needed to replace it much sooner.

Right now. Yeah, in two years it could maybe be an issue, although I expect it would be something like being limited to high textures at 4K instead of ultra in certain games.
> Right now. Yeah, in two years it could maybe be an issue, although I expect it would be something like being limited to high textures at 4K instead of ultra in certain games.

This. I'm OK with 10GB on the 3080, as I'll be upgrading the GPU once the next generation comes out. In that time I anticipate very few games actually surpassing 10GB of VRAM, especially at the resolution I play: 3440x1440, soon likely to be 3840x1600. But if the price difference were small (less than $200), I'd pony up for the 20GB.
Given the cost of GDDR6X, I think it's really likely a 20GB version of the 3080 would be priced $300 higher. I don't think the premium would be worth it to me - I'd rather save the $300 and upgrade to a new card in 3-4 years instead of 5-6. I'm sure there's a case to be made that you'd get some of that back in extra resale value, but I'll take the extra savings up front.
> Right now. Yeah, in two years it could maybe be an issue, although I expect it would be something like being limited to high textures at 4K instead of ultra in certain games.
>
> Given the cost of GDDR6X, I think it's really likely a 20GB version of the 3080 would be priced $300 higher. I don't think the premium would be worth it to me - I'd rather save the $300 and upgrade to a new card in 3-4 years instead of 5-6. I'm sure there's a case to be made that you'd get some of that back in extra resale value, but I'll take the extra savings up front.

I'm far from an expert on the subject, but the most convincing numbers I've seen suggest it would cost AIBs an extra $100, though I believe that was spot price rather than large-scale contract price, and with how new GDDR6X is, the price will drop quickly. That $100 might easily become an extra $200 at retail, but I doubt it would add $300. I think $100-$200 extra on an $800 card is more than reasonable if it means the card is more viable in 2 years. I would be much less concerned if I were planning to replace it as soon as something better comes out.
> I'm far from an expert on the subject, but the most convincing numbers I've seen suggest it would cost AIBs an extra $100, though I believe that was spot price rather than large-scale contract price, and with how new GDDR6X is, the price will drop quickly. That $100 might easily become an extra $200 at retail, but I doubt it would add $300. I think $100-$200 extra on an $800 card is more than reasonable if it means the card is more viable in 2 years. I would be much less concerned if I were planning to replace it as soon as something better comes out.

I'm also no expert, but I believe GDDR6 costs Nvidia about $12/GB. We don't know exactly what GDDR6X costs, just that it costs more. Guessing about $15/GB is how I arrived at $300: the extra 10GB works out to roughly $150 in memory cost, doubled at retail. I certainly could be wrong, and if it's only a $100-200 premium, that would make it much more compelling.
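The back-of-the-envelope math in this exchange can be written out explicitly (purely illustrative: the $12-15/GB figures and the 2x retail markup are this thread's guesses, not confirmed pricing):

```python
# Illustrative only: estimate the retail premium of a 20GB 3080 over the
# 10GB model, using the thread's guessed memory prices and markup.
def retail_premium(extra_gb, cost_per_gb, markup=2.0):
    """Extra retail price = extra memory cost * assumed retail markup."""
    return extra_gb * cost_per_gb * markup

# Guessing ~$15/GB for GDDR6X and a 2x markup gives the $300 figure:
print(retail_premium(10, 15.0))  # -> 300.0
# The competing estimate (~$100 extra BOM cost, ~2x at retail):
print(retail_premium(10, 10.0))  # -> 200.0
```

The disagreement in the thread is really just about the per-GB input cost and whether the markup on the extra memory would be closer to 1x or 2x.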
I also think you're underestimating what the new console gen will do to bump up VRAM requirements, the current consoles have half the memory of the next ones and we already have a game or two that perform better with more than 10GB. The new consoles will be running higher resolutions which will use up some of the extra memory but I also think we'll see more of the shared memory used by the GPU this generation. My guess is that in two years VRAM requirements will basically double and I certainly wouldn't want a 5GB card right now, especially if it was the performance level of say the 2080.
> I'm thinking NV did cancel their 16 and 20GB SKUs, if they were ever planned in the first place. They quit selling the Founders Edition cards themselves and put Best Buy in charge of selling them, so dropping any further planned versions makes sense. As for their partners making them: why would they? Can they? At that point NV is just supplying the chips plus technical info and support. If EVGA wanted to make a 20GB 3080, would NV say "NO! You may not try to keep up with Big Navi and its 16GB!"? In other words, I think we'll see 20GB 3080s and 16GB 3070s eventually, just not as Founders Edition cards and not until the board manufacturers think it's in their best interest to do so.

NV partners do nothing like that without full NV approval. Especially EVGA.
> NV partners do nothing like that without full NV approval. Especially EVGA.

This makes total sense when you consider that the only $999 2080 Ti option that wasn't a rebate or sale was the EVGA Black card. I love the customer support over there and would love for them to branch out with AMD products. I would put them at the top of my list if they ever did that.
> This makes total sense when you consider that the only $999 2080 Ti option that wasn't a rebate or sale was the EVGA Black card. I love the customer support over there and would love for them to branch out with AMD products. I would put them at the top of my list if they ever did that.

That will never happen. Nvidia carried, and I expect still carries, a large amount of EVGA's debt. Nvidia owns EVGA in more ways than one. This has been common knowledge behind the scenes for years. That said, I have never seen absolute proof of it, but I very much trust the sources that have given that information. My two cents, you may need change.
> This makes total sense when you consider that the only $999 2080 Ti option that wasn't a rebate or sale was the EVGA Black card. I love the customer support over there and would love for them to branch out with AMD products. I would put them at the top of my list if they ever did that.

Insurance for the $999 MSRP that Nvidia laid out for the 2080 Ti is my guess. They don't even make or carry AMD motherboards; that is just throwing money away, though maybe they will in the future. If they are for the pro enthusiasts and hardcore gamers, they should for Zen 3.
> Insurance for the $999 MSRP that Nvidia laid out for the 2080 Ti is my guess. They don't even make or carry AMD motherboards; that is just throwing money away, though maybe they will in the future. If they are for the pro enthusiasts and hardcore gamers, they should for Zen 3.

I read it the same way. They were pulled into an office and told they had to make it, no matter how it performed or looked. I don't think we will live to see EVGA build anything for AMD products. I like my EVGA motherboards and power supplies, and I think this is my third or fourth EVGA video card in the wife's machine.