Fermi GTX480 is broken and unfixable

I'm confused also...did you quote the wrong post? :D

No, I was just sort of agreeing with you that it's not really gonna take some massive leap for Nvidia to come back. Realistically (please god don't let me jinx myself), Nvidia can just rebadge/refine the GTX285 and sell it at a more competitive price to come back into the market. Please god don't let nvidia read this.
 
The guy is right about the release date (which, if I remember correctly, Kyle predicted before he did) and all of a sudden he's the golden pillar of unbiased information? Please, this is sensationalist journalism at its finest. The people who blindly follow this guy (90% of what he says about Fermi is correct? Really, show me how any of you who said that came up with this number) are just as bad as the people who say everything he spouts is BS. He takes known facts and spins them to generate more traffic to his site. You don't need to know much about computer architecture to know that a 3 billion transistor chip is going to run hot, and coupled with the fact that TSMC's 40nm process has long been known to be utter fail up until recently, it's not hard to predict that Fermi was going to be a tough build. It doesn't take "unnamed industry insiders" and "anonymous sources" to make these predictions. A little imagination and an internet connection, and boom, you've got your very own sensationalist blog... I mean, news website.


BTW, Nvidia has already come out with a response to this fine piece of journalism. Take it for what it's worth (both Charlie and nvidia, for that matter…):

http://twitter.com/Igor_Stanek/status/9243031134



http://twitter.com/Igor_Stanek/status/9242904593

Problem is that Stanek is a PR person and a total moron (I met him, and all the info I saw from him in the past 6 months was total BS).

 
What has he been right about, other than that Fermi was being delayed?

I'd have to go back through his articles, but the manufacturing shortages, the 2% yield problem, the fake card shown to the press. I haven't actually seen anything he's been wrong about. I agree he's 60-70% logical assumption, but he's approaching Nostradamus status to the point where I think he's got somebody handing him inside memos.
 
Charlie is very anti-Nvidia, but that doesn't mean that he doesn't have some pretty good sources. Is he right all the time? No. Is he right often enough to make him a real journalistic presence? Yes. Is he fair? No. Do any of these facts negate each other? NO.

So when people bitch about "take it with a grain of salt," sorry, it is necessary. He is not fair and impartial. He has his own agenda to push. Equally, when people claim that it's all lies, well, sorry guys, but his record stands better than a LOT of the mainstream journalists out there.

I in no way expect people to see things my way, but it surprises me that so many go to such an extreme on either side. Reasonable people should look at an issue from all sides as much as we are able to. To me that puts Charlie as an interesting source that needs to be taken with a little salt.
 
I agree he's 60-70% logical assumption, but he's approaching Nostradamus status to the point where I think he's got somebody handing him inside memos.

I wouldn't be surprised if he had a source inside nVidia - unhappy employees are often very good sources of such information.

 
No, I was just sort of agreeing with you that it's not really gonna take some massive leap for Nvidia to come back. Realistically (please god don't let me jinx myself), Nvidia can just rebadge/refine the GTX285 and sell it at a more competitive price to come back into the market. Please god don't let nvidia read this.

I really don't understand why they DIDN'T do this. In fact they could have done this the way ATI used the 4770. They didn't need 10.1 or even 11 at the time and could have saved themselves a lot of heartburn.
 
LOL, I love how he explains the die error issues that affect yields, when the fact remains that all nVidia needs is one operational chip to demo, and we would see if Fermi is anything worth pursuing. But all we had from nVidia was a homegrown, structured demo on a specific application. That should tell you more than "Fermi is having yield issues." If the chip were a fully functional GPU, they would have had some standard benchmarks on current games in their demo, but they didn't. The problem is not just the yields; they are having software issues as well, due to the new architecture no doubt.

Charlie is giving you nothing new in that article; we all knew last year that nVidia would have trouble getting usable yields and keeping heat and power under control, thus delaying the chips. All he is saying is they still haven't fixed it. Well DUH, there are no more official numbers coming out or leaks, so I could have told you that.

Just more FUD from the nVidia FUDmaster.
 
Problem is that Stanek is PR person and total moron ( I met him and all info I saw from him in past 6 months was total BS).

That's why I said this:

Take it for what it's worth (both Charlie and nvidia, for that matter…)

I would say neither source is reliable... but there are those who will believe what they want to believe and choose one of them then claim it as proof that what they believe is without a doubt right, instead of looking at the bigger picture. Both sides be damned, I say.
 
(edit: yes, I misread. Skip ahead)

So he's saying nVidia can get almost 1M functional chips out with already bought wafers at the current (problematic) yields. Do they really need more?

I mean, sure, it'll be Geforce FX/dustbuster all over again, but if they can put ~1M cards in the channel and move on, maybe that'll be good enough? No, not to make a profit or anything, but as a 'cut our losses' move where they don't actually have to publicly admit defeat. Get the cards out, hype them, sell them at a huge loss. Move on.

Of course, that just pushes the problem in front of them if they're so screwed by the process that they can't produce cut down mass-market parts, which is really what the article is talking about when it's saying "unmanufacturable".

Interesting times anyhow.
 
I'd have to go back through his articles, but the manufacturing shortages, the 2% yield problem, the fake card shown to the press. I haven't actually seen anything he's been wrong about. I agree he's 60-70% logical assumption, but he's approaching Nostradamus status to the point where I think he's got somebody handing him inside memos.

The fake card we all knew was a fake card; all of the other things are speculation, and all are basically just saying Fermi will be delayed.
 
So he's saying nVidia can get almost 1M functional chips out with already bought wafers at the current (problematic) yields. Do they really need more?

No he isn't, read that again. They can fit 104 chips on a wafer, and they have 9,000 wafers. This gives the ~1M chips, but that does NOT mean ~1M functional chips. The rumor is single-digit yields, which drops you to a range of 10,000 to 100,000 functional chips (probably closer to the 100,000 than the 10,000 ;) ) that need to be split among multiple segments (GTX 480, 460, Tesla, etc...)
 
So he's saying nVidia can get almost 1M functional chips out with already bought wafers at the current (problematic) yields. Do they really need more?

I mean, sure, it'll be Geforce FX/dustbuster all over again, but if they can put ~1M cards in the channel and move on, maybe that'll be good enough? No, not to make a profit or anything, but as a 'cut our losses' move where they don't actually have to publicly admit defeat. Get the cards out, hype them, sell them at a huge loss. Move on.

Of course, that just pushes the problem in front of them if they're so screwed by the process that they can't produce cut down mass-market parts, which is really what the article is talking about when it's saying "unmanufacturable".

Interesting times anyhow.

That 936K chips is the total amount of Fermi-size chips that fit into the 10K wafers that NV already paid for - with 10% yield that is ~93K working chips (and only some of them can be used for the 480 and Tesla; most will have to be used for the 470).

 
I'm starting to think Charlie is being fed at least some of his info from reliable sources. I just wish nvidia had accepted stagnation instead of going psycho with the rebadging. It's giving me a big goddamn headache to see all these laptops coming out with mobile nvidia chips and no one can even tell you conclusively what the f**** specs are.

He isn't always off base. I think he tends to take some truth and then sensationalizes it to death. His previous G92 arguments were proof of that. He claimed they were all defective. That was horseshit as only some of them were. When the article came out about Fermi production yields he was close, but exaggerated the percentages slightly. Actually he nearly cut them in half or something like that as I recall.

The problem with Charlie is it is hard to tell where the truth ends and the sensationalism begins. This is why I dislike his work, though I do understand what he is doing and why.
 
Nvidia purchased $45,000,000 worth of wafers. 104 Fermi dies can fit on each wafer. That means Nvidia can make 936,000 dies for $45 million.

Of those 936,000 dies, only a certain percentage of them are functional. That is the yield. If 50% of the dies were functional (468K), then the yield would be 50%.

If yields are 20%, then only 187,200 would be functional. If yields are 15% then only 140,400 would be functional.

The problem is Nvidia pays for dies that don't work.
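The wafer/yield arithmetic in the post above can be sketched in a few lines. The figures (9,000 wafers, 104 die candidates per wafer, and the hypothetical yield percentages) are the ones quoted in this thread, so treat them as rumor, not fact:

```python
# Rough sketch of the wafer/yield arithmetic discussed in this thread.
# All figures are the thread's rumored numbers, not official ones.

WAFERS = 9_000          # wafers Nvidia reportedly purchased
DIES_PER_WAFER = 104    # Fermi die candidates per wafer

def functional_dies(yield_pct: float) -> int:
    """Total functional dies at a given yield percentage."""
    return int(WAFERS * DIES_PER_WAFER * yield_pct / 100)

total = WAFERS * DIES_PER_WAFER  # 936,000 die candidates
print(f"Die candidates: {total:,}")
for y in (50, 20, 15, 10):
    print(f"{y:>3}% yield -> {functional_dies(y):,} functional dies")
```

Running this reproduces the numbers quoted above: 468,000 at 50%, 187,200 at 20%, 140,400 at 15%, and roughly the "93K" figure at 10%.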
 
That 936K chips is the total amount of Fermi-size chips that fit into the 10K wafers that NV already paid for - with 10% yield that is ~93K working chips
You're right, I misread him. Should have gone through the advanced 9000*104 math :-\

Well, so much for that then.
 
Assuming this information turns out to be correct, I wonder what it means for anticipated price drops on ATI products across the board, and if there even will be any for a while now, with Nvidia sitting on the sidelines with an upcoming line of what is rumored to be products that don't compete well with ATI's offerings.

Assuming this information all turns out to be true, we'd all do well to consider the implications that could mean for some of us looking to upgrade this year at certain price points.
 
I've said it before, but the other thing this points to is that there's no way in the world a single-GPU Fermi is going to cost less than $600 retail. And if the numbers in this article are to be believed, then it might be more than that, unless nvidia is willing to sell cards at a hefty loss.
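A quick back-of-the-envelope check on that claim, taking the thread's own numbers at face value ($45M wafer spend, 936,000 die candidates - both rumored, not official) and ignoring packaging, memory, board, and testing costs entirely:

```python
# Back-of-the-envelope silicon cost per *working* die, using the
# figures quoted in this thread. Rumored numbers, not official ones.

WAFER_SPEND = 45_000_000   # dollars Nvidia reportedly paid for wafers
DIE_CANDIDATES = 936_000   # 104 dies/wafer * 9,000 wafers

def cost_per_working_die(yield_pct: float) -> float:
    """Silicon cost per functional die at a given yield percentage."""
    working = DIE_CANDIDATES * yield_pct / 100
    return WAFER_SPEND / working

for y in (50, 10, 5, 2):
    print(f"{y:>3}% yield -> ${cost_per_working_die(y):,.0f} per working die")
```

At a healthy 50% yield the silicon alone runs about $96 per working die, but at the rumored single-digit yields it climbs to roughly $480 (10%) up past $2,400 (2%) - which is why a sub-$600 retail card looks implausible without Nvidia eating a loss.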
 
All this crap is just dinner talk until nvidia finally comes out with something. Fanboi-ism or not, the fact is ATI has had a fine DX11 solution out in all segments of the market for months now, and the competition has had jack shit.
 
I really don't understand why they DIDN'T do this. In fact they could have done this the way ATI used the 4770. They didn't need 10.1 or even 11 at the time and could have saved themselves a lot of heartburn.

They did do this. Or TRIED to, anyway; it was on their roadmap, they just couldn't pull it off. They did finally come out with a couple of year-late, cut-down DX10.1 40nm GT200-based cards, but they cancelled their GT212 series, which was supposed to be their 40nm 260/280 refresh.

Of course they could have decided to revive the GT212 once they realized Fermi at 40nm was going to be even more unmanufacturable and hopelessly uncompetitive. It's possible they are working on a 40nm GT212+DX11 solution, which would at least be competitive and make money ... assuming they could work out the design/fabrication bugs and actually manufacture it with workable yields.
 
I love how these threads derail more into Charlie than into what the article actually contains...

If what he says is true, this could turn into a disaster for Nvidia. I wonder when the next design is set to tape out for Nvidia. It sounds like even if their next-generation design were done today, it would be at least 6 months before any sort of production could even start. That puts Nvidia into the September range for releasing anything new and profitable.


And what is more, as maybe some forget: if Nvidia gets, say, 50K working chips back, these have to be "divided" up between its AIBs. Who gets what?

I'll bet XFX and MSI (and others) are really enjoying the fact they are playing both ATI and Nvidia fields right now. While you have partners like EVGA and BFG who are only doing Nvidia cards.
 
They did do this. Or TRIED to, anyway; it was on their roadmap, they just couldn't pull it off. They did finally come out with a couple of year-late, cut-down DX10.1 40nm GT200-based cards, but they cancelled their GT212 series, which was supposed to be their 40nm 260/280 refresh.

Of course they could have decided to revive the GT212 once they realized Fermi at 40nm was going to be even more unmanufacturable and hopelessly uncompetitive. It's possible they are working on a 40nm GT212+DX11 solution, which would at least be competitive and make money ... assuming they could work out the design/fabrication bugs and actually manufacture it with workable yields.

Problem with this is time - if they decided in January to revive the GT212, it would still take at least 6 months to release anything. It's IMO more likely nVidia would do a complete respin of Fermi.

 
You're right, I misread him. Should have gone through the advanced 9000*104 math :-\

Rumors abound that Nvidia will only have 5,000 to 8,000 Fermi GF100s, branded GTX480 in the first run of cards. The number SemiAccurate has heard directly is a less specific 'under 10,000'.

edited per eloj
 
BFG and EVGA would really benefit from dropping their nvidia exclusivity.

Yeah, it's not a good idea to have a single supplier; however, I am quite sure the Nvidia-exclusive partners made a killing during the G80 golden years. So it's really a hedge.
 
I believe the article, and I believe there are a lot of fanboys in denial... I have AMD/Intel and ATI/Nvidia, but if it quacks like a duck, and looks like a duck... by the end of this year, Fermi may not even be competing with the 5000 series anymore. Nvidia dropped the ball; chit happens. They will rebound... someday.
 
So now you have a contentless post. I've already explained that if I misread the article and had I done the math (multiplied 104 chips with 9000 wafers) it would have become obvious that the 936K figure for "104 die candidates per wafer, 9,000 wafers" couldn't possibly be with "single digit" yields accounted for.

I realize you're trying to be a standard forum ass hat by assuming I'm still confused about something, but you know what they say about assumptions.

Now are you happy or is there anything else I can help clarify for you?
 
Sigh, how do I already know where this "information" comes from without even reading the OP.. :rolleyes:
 
I really hope Charlie is wrong on this one. I'm really jonesing for a 2nd 5870 now that eyefinity in xfire is fully supported by the next patch. I don't want to spend another 400 bucks tho.
 
I must admit that I'm very surprised at the level of denial that still exists. If the SA article is wrong, be sure to point it out with your (presumably new and informed) facts, not character assassination.

Of course, one could just ask Nvidia for their side of the situation. Kyle, any chance of asking, just for the record/fun?
 
Usually I might dismiss a lot of this as rubbish, but after reading the AMD article over at Anand the other day, which went into some depth about the 40nm process problems AMD had to overcome to get their parts out by actually doing their homework, it actually seems this could be true.

Anand said that because ATI routinely took the lead in moving to new nodes while Nvidia hung back, ATI had far better engineering/experience resources available when moving to a new node than Nvidia, PLUS they were able to work out the bugs at 40nm with the 4770.

So Nvidia started with these two strikes against them, poor new-node engineering resources and no pilot chip, WHILE attempting to design and manufacture a brand new architecture that was ALSO the largest (most transistors) and most complex ever attempted. This is hubris writ large. And from a corporate cost/benefit/risk analysis, insane.

This all dovetails nicely with Charlie's article, and Charlie has, so far, been putting his Fermi arrows straight into the middle of the bullseye.

Far more likely than not, Fermi IS broken and unfixable.

Jen rolled the dice and crapped out.
 
Of course, one could just ask Nvidia for their side of the situation. Kyle, any chance of asking, just for the record/fun?

The only info from NV that I'd believe would be if they gave cards to reviewers and ended the NDA so we can see real numbers.

 
BFG and EVGA would really benefit from dropping their nvidia exclusivity.

http://www.hardforum.com/showthread.php?t=1491199

Graphics AIB shakeout coming?

--------------------------------------------------------------------------------

This speculation is based on the following data points.

1. Nvidia's consumer Fermi line launching six+ months after AMD's 5000 series.
2. AMD's saying a refresh of the entire Evergreen line will occur in 2H 2010.
3. The growing likelihood that the entire consumer Fermi line's sales will be relatively anemic AND brutally profit-squeezed from the day they are rolled out to the end of 2010, and possibly well into 2011.
4. Intel's move to incorporate graphics on-chip, and AMD's Fusion roadmap/dropped hints indicating it will incorporate Evergreen in 2H 2011 and Northern Islands in 2H 2012 into its Fusion solutions ~ over time the low-mid range discrete board market will diminish to multi-monitor buyers.

For Nvidia AIB partners, this is not a rosy profit outlook, to say the least. For those partners highly dependent on Nvidia based boards, such as EVGA and BFG, the outlook is even less rosy.

With a very small profit pie to divide up and little to look forward to, the scrabbling over that pie is going to get messy if not downright poisonous. A major shakeout is almost certain.

Meanwhile, over at AMD, sales are brisk and profit margins fat, and AMD is in an excellent position to drop a few marginal AIB partners and pick up one or more of Nvidia's premier partners.

The first that comes to mind is THE Nvidia partner, EVGA. AMD adding EVGA as a partner would be a kidney punch to Nvidia, accomplishing:

1. A vote of no confidence in Nvidia by its premier and 'special' AIB partner.
2. Legitimizing AMD in the eyes of EVGA brandbois and Nvidia fanbois.
3. Eroding Nvidia's market share in general and its fanboi market share in particular.

EVGA's business is SLI motherboards (now reduced to the Intel camp) and Nvidia-based graphics boards, and neither looks to be growing or particularly profitable into the future, all while they watch a contemporary, XFX, immediately bite off a sizeable chunk of AMD's already fat-margin graphics board market with their pricier, even fatter-margin boards.

BFG is also a prime candidate, as Nvidia graphics boards ARE its business.

Bidness is bidness.

AMD adding in one or two premier Nvidia-only partners while dropping one or more of their more marginal partners, Diamond and Vision-Tek for example, would be brutal for Nvidia.
 
I really hope Charlie is full of crap on this one but he has been quite semiaccurate before about Fermi. And yes I fully understand he is completely in the tank for AMD in my opinion. However.... Yeah this is not good.

And even if he is full of crap Nvidia is feeding RIGHT into it by the lack of news.

BTW, Nvidia can't accept defeat with Fermi. They have the card makers howling for new product, and the complete loss of 40nm Fermi would mean disaster beyond belief for the big green N. They would have to focus entirely on Tesla and Tegra 2, and something tells me those would not be enough to keep the company afloat.
 
They did do this. Or TRIED to, anyway; it was on their roadmap, they just couldn't pull it off. They did finally come out with a couple of year-late, cut-down DX10.1 40nm GT200-based cards, but they cancelled their GT212 series, which was supposed to be their 40nm 260/280 refresh.

Of course they could have decided to revive the GT212 once they realized Fermi at 40nm was going to be even more unmanufacturable and hopelessly uncompetitive. It's possible they are working on a 40nm GT212+DX11 solution, which would at least be competitive and make money ... assuming they could work out the design/fabrication bugs and actually manufacture it with workable yields.

That is what boggles my mind - they HAD to know this was coming, maybe not as bad as it did, but surely no other sane company puts this many eggs in a single basket. Why not take what you have working and make it profitable? An 850MHz GTX285 with a narrower, cheaper bus might be somewhat competitive with the 5870. They have been doing that for years with G80. And if they can't make a die shrink of a G92 work, why would they try to make something a magnitude more complex? Now they are stuck waiting for 28nm to make this work. I don't understand why they dropped this.
 