Fermi GTX480 is broken and unfixable

We'll all see very easily whether Charlie was right or not - and that's when Fermi comes out. This is one of his biggest angles for 2009 and 2010. He's got a lot riding on his predictions of its doom and gloom.

He's given a ton of specifics; we can match them up against the actual product when it arrives, if it arrives at all. This article does feel like he's been cribbing Anand's notes a little, though.
 
http://www.semiaccurate.com/2010/01/17/nvidia-gf100-takes-280w-and-unmanufacturable/

I couldn't make myself read more than the first few paragraphs of that article. 'Unbiased' is definitely not one of the keywords here.

I don't think Charlie has ever tried to sound unbiased. It's worse with the websites that pretend to be neutral but review graphics cards following the directions, and using the games, that the vendor asks them to. That's more like ordering a review, and the sites dare not say no, because it would limit their chances of getting review hardware later. Sites that accept such "The Way It's Meant to Be Reviewed (Nvidia)" or "Get in the Review (AMD)" arrangements are much worse, since they pretend to be neutral in their reviews while being just a PR tool for a vendor.

Charlie is pretty clear about his feelings against Nvidia and has never pretended to be neutral. His articles can be taken with a bucket of salt, but they are interesting to read, since he sometimes tends to be correct way before others. Especially when it comes to Fermi.

But most of the article doesn't matter to the regular user. How much or how little money Nvidia will earn on it shouldn't matter to a gamer. We'll see how much it costs, how it performs and how available it is once it is released.

I don't think Nvidia will release it unless it's competitive in price and performance to the point where people will buy Fermi. So far there is no viable alternative to ATI, in my opinion, and I think Fermi will change that, regardless of whether Nvidia profits on it or not. We will have an alternative to ATI, and that's what matters: choices for users.
 
But most of the article doesn't matter to the regular user. How much or how little money Nvidia will earn on it shouldn't matter to a gamer. We'll see how much it costs, how it performs and how available it is once it is released.

I don't think Nvidia will release it unless it's competitive in price and performance to the point where people will buy Fermi. So far there is no viable alternative to ATI, in my opinion, and I think Fermi will change that, regardless of whether Nvidia profits on it or not. We will have an alternative to ATI, and that's what matters: choices for users.

Since you commented that this article doesn't matter much to the regular user, I'd agree, as the first Fermis will be high-end parts that won't be in an average user's price range. However, for hardware enthusiasts I would say this article is quite relevant.

You also brought up the issues of cost, performance, etc., which is pretty much what this article pertains to: lower clocks due to manufacturing issues, shaders possibly being disabled for additional yield, and so on. It also brings up the wafer cost and the single-digit yields.

I'm curious what the cost of manufacturing a single Fermi card will end up being. There was a reference to roughly $50 million spent on wafers that could yield up to 936k chips at 100% yield. If we assume they have solved their manufacturing problems and can now produce yields in the double digits, say 20%, which would be twice what Charlie is estimating, you'd have 187k chips for a potential 187k Fermi cards, which is still about $267 per chip, not including the heatsink, packaging, circuitry, marketing, samples for reviewers, and expected warranty costs. At that rate, it seems Fermi could be both fairly profitable and reasonably priced.

If Charlie is correct, though, and yields are still single digit, such as 9.9% (the highest reasonable single-digit yield), you'd be at over $500 per chip as stated in the article, which would set Fermi's initial price pretty high. If they wanted a 20% profit margin, you'd have a $600 card, putting it on par with the 5970, and I'm doubtful it would perform better than a 5970, with current benchmarks showing it less powerful, plus the possibility of lowered clock speeds and the possibility of fewer shaders than originally intended.

I'm hoping the turning off of 32-64 shaders discussed in the article might double yields, say from 9% to 18%, so the initial cost of the chips can be lower and competitive with ATI, even if it means Fermi is less powerful. If the chips were only $267 and they wanted a 20% profit margin, you'd only be looking at around $320-350, putting it closer to the 5850's pricing. If it performed as well as the 5870 even after the slight scaling back of shaders, you'd have quite a nice price war going on.
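For anyone who wants to play with these numbers, here's a quick back-of-the-envelope sketch in Python. It assumes the thread's figures of roughly $50 million in wafer spend and 936k chip candidates at a perfect yield; the yield points and the flat 20% margin are illustrative guesses, not Nvidia numbers.

```python
# Back-of-the-envelope Fermi chip cost vs. yield, using the figures
# quoted in this thread: ~$50M of wafer spend and ~936k chip candidates
# at a hypothetical 100% yield. Yield points and margin are assumptions.

WAFER_SPEND = 50_000_000   # total wafer spend in USD (thread's figure)
CANDIDATES = 936_000       # chip candidates at 100% yield (thread's figure)

def cost_per_good_die(yield_rate):
    """Spread the wafer spend over the dies that actually work."""
    return WAFER_SPEND / (CANDIDATES * yield_rate)

def card_price(chip_cost, margin=0.20):
    """Naive price: bare chip cost plus a flat margin. Ignores board,
    memory, heatsink, packaging and channel markup."""
    return chip_cost * (1.0 + margin)

for y in (0.099, 0.18, 0.20):
    chip = cost_per_good_die(y)
    print(f"yield {y:5.1%}: ${chip:6.2f}/chip -> ~${card_price(chip):6.2f}/card")
```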
 
As an enthusiast, wouldn't you be more interested in the actual product, its performance and price once it is released, rather than in how it got there and Nvidia's yields?

If Nvidia prices the card to compete with similar options from ATI, do you really care how much money Nvidia earns on it? Nvidia won't price it higher than what people are willing to pay for it anyway.

An enthusiast normally cares about what he can get at a certain price, not so much about how much a vendor earns on it. If the price is right, he buys it; if not, he doesn't.
 
That would be nice; however, Nvidia chips have traditionally been very expensive at launch. The 8800GTX when released was what, like $500-600? Even now the GTX 280 is still close to $400. I just don't see this happening.
 
huh?

At their predicted 40% yield, ATI is at $92 a chip (sounds high to me); the prediction for Nvidia is $121 a chip.
And that is just the chip. Someone is saying $267 a chip plus a 20% profit margin works out to around $320-350, close to 5850 pricing???

Add the board, memory, heatsink, packaging and vendor profits, and it's looking like $500 plus. Which I believe is what people were saying back in December. At that time some of them were also saying 40% faster than a 5870.
Man, if they did that with cardboard, I can't wait to see the real card.
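As a rough illustration of how the bare chip figure balloons to "$500 plus" once the rest of the card is added, here's a toy bill-of-materials sketch. The $267 chip figure is the 20%-yield estimate from earlier in the thread; every other line item and margin is a made-up placeholder, not a known cost.

```python
# Toy bill-of-materials sketch behind the "$500 plus" guess above.
# The $267 chip figure comes from the earlier post; every other line
# item and both margins are made-up illustrative guesses.
bom = {
    "GPU die (20% yield figure)": 267.00,
    "GDDR5 memory":               60.00,
    "PCB + power circuitry":      50.00,
    "Cooler / heatsink":          25.00,
    "Packaging + accessories":    15.00,
}
build_cost = sum(bom.values())    # ~$417
aib_price = build_cost * 1.30     # assumed board-partner margin: past $500
retail_price = aib_price * 1.15   # assumed retail markup on top

print(f"build cost:   ${build_cost:.2f}")
print(f"AIB price:    ${aib_price:.2f}")
print(f"retail price: ${retail_price:.2f}")
```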

Personally, I think in August they were thinking about the new 2-fps-faster model, the 299SUPER GTX at $500, and another generation of card in May of 2010.
Then ATI dropped the 5870 and Nvidia went into the old lie-and-paper-launch mode.
 
That would be nice; however, Nvidia chips have traditionally been very expensive at launch. The 8800GTX when released was what, like $600? Even now the GTX 280 is still close to $400. I just don't see this happening.

If people don't buy it at $600 (provided that this will actually be the price), they will lower prices toward a point where people will buy it. The GTX 280 is still $400 because people will still pay that price for it. Considering its low supply, they can charge that price to the few morons left who would actually pay $400 for that card, and not risk getting stuck with a lot of surplus cards.
 
I'm willing to bet that all units will be sold at the price Nvidia decides; people might moan at a $600 price, but they will get picked up. Fastest GPU (if it is, and probably yes) is all the high-end enthusiasts need to see to be sold on it at almost any price.
 
I can't believe people are still going on about yields. Yields don't and won't matter; even if they only get 2% yields, they would still produce enough cards to put a top performer on the market, which is all they are shooting for this year (at least the first half, anyway). They just want the performance title, as always, to market the brand name; they make their money on the mid to low end chips (GPUs, anyway).

What you are all missing is that they HAVE working samples, and have had them for quite some time, but we have yet to see any numbers leaked. That should tell you something. Either the chip is a dog and can't clock worth a damn, making the numbers so low they can't possibly market them, or there is a major software issue with the new pipeline, making the chip a dog in current applications.

As far as EVGA goes, people don't buy their hardware solely because of SLI; it's their build quality, customer support, warranty, and step-up program. So as long as those remain, EVGA will be head and shoulders above the rest.
 
I can't believe people are still going on about yields. Yields don't and won't matter; even if they only get 2% yields, they would still produce enough cards to put a top performer on the market, which is all they are shooting for this year (at least the first half, anyway). They just want the performance title, as always, to market the brand name; they make their money on the mid to low end chips (GPUs, anyway).
From a company financial standpoint, pretty much right on. Flagships are halo products.

What you are all missing is that they HAVE working samples, and have had them for quite some time, but we have yet to see any numbers leaked. That should tell you something. Either the chip is a dog and can't clock worth a damn, making the numbers so low they can't possibly market them, or there is a major software issue with the new pipeline, making the chip a dog in current applications.
Not really. Nvidia is ALWAYS tight-lipped about performance before launch. 8800, 9800, 280, it's all the same. You're lucky if you see real performance numbers a week before launch. It's just SOP for them.

As far as EVGA goes, people don't buy their hardware solely because of SLI; it's their build quality, customer support, warranty, and step-up program. So as long as those remain, EVGA will be head and shoulders above the rest.
Yep.
 
I'm a copyeditor, and one thing I have noticed, is that when a person can't write, it's also a sign that this person has no credibility.

Here's what Charlie wrote: "This compares quite unfavorably to it's main competitor, ATI's Cypress HD5870 at 334mm^2. ATI gets over 160 chip candidates from a wafer, but Nvidia gets only 104."

This guy needs to look up possessive 's' in a book of gammer. I was taught about possessive 's' literally in the second grade. His entire article is full of grammatical errors of this nature. It tells me a lot about him - chiefly, that I should ignore him.
 
I'm a copyeditor, and one thing I have noticed, is that when a person can't write, it's also a sign that this person has no credibility.

Here's what Charlie wrote: "This compares quite unfavorably to it's main competitor, ATI's Cypress HD5870 at 334mm^2. ATI gets over 160 chip candidates from a wafer, but Nvidia gets only 104."

This guy needs to look up possessive 's' in a book of gammer. I was taught about possessive 's' literally in the second grade. His entire article is full of grammatical errors of this nature. It tells me a lot about him - chiefly, that I should ignore him.

wtf is the book of gammer?
 
I'm a copyeditor, and one thing I have noticed, is that when a person can't write, it's also a sign that this person has no credibility.

Here's what Charlie wrote: "This compares quite unfavorably to it's main competitor, ATI's Cypress HD5870 at 334mm^2. ATI gets over 160 chip candidates from a wafer, but Nvidia gets only 104."

This guy needs to look up possessive 's' in a book of gammer. I was taught about possessive 's' literally in the second grade. His entire article is full of grammatical errors of this nature. It tells me a lot about him - chiefly, that I should ignore him.

really? grammer nazi?

welcome to the internet, people have bad grammer.
 
...wow... I can't remember a piece of kit generating so much drama in the last 10 or 15 years, haha. Dammit NV, I just want a shiny new badass card! =P
I'm thoroughly enjoying the drama. The last 12 months of tech news seems to have been centered on the latest telephones (yawn), the latest telephone operating systems (double yawn) and the latest itty-bitty, barely capable netbooks (nuclear yawn). Charlie D. has added much spice to tech reporting, at least in the GPU arena.

WabeWalker, I would suggest you judge Charlie on his merits, that is, his record. I've been reading him for years. He's accurate far more often than not. I'm sure, that if he just copied Nvidia's PR blather (the PR spinners have lofty college degrees, after all), he'd get his possessives right. But then, it's all academic, isn't it? ;)
 
I'm a copyeditor, and..........book of gammer. I was taught about possessive 's' literally in the second grade.

As opposed to figuratively? Also, copy editor is two words. Nice job mister fancy pants grammar nazi. :D

Of all the factors someone could use to discredit an author's information ON THE INTERNET, grammar should be about the last. Charlie has proven himself accurate enough on this subject that giving his articles an objective read is still worthwhile.
 
I'm a copyeditor, and one thing I have noticed, is that when a person can't write, it's also a sign that this person has no credibility.

Here's what Charlie wrote: "This compares quite unfavorably to it's main competitor, ATI's Cypress HD5870 at 334mm^2. ATI gets over 160 chip candidates from a wafer, but Nvidia gets only 104."

This guy needs to look up possessive 's' in a book of gammer. I was taught about possessive 's' literally in the second grade. His entire article is full of grammatical errors of this nature. It tells me a lot about him - chiefly, that I should ignore him.
You might as well stop using the internet altogether. Bad grammar on the internet is an immutable fact of life.
 
As opposed to figuratively? Also, copy editor is two words. Nice job mister fancy pants grammar nazi. :D

hahaha nice. Im absolutly terrible at grammer and spelling but it doesnt mean im bad at other things. Notably math... which is about as usefull on the internet as grammer.
 
I can understand his point. A "professional" journalist, or anyone trying to appear as one, should use better grammar than your typical forum poster. That said, the writing style shouldn't be used as an excuse to automatically disregard what's being written. That is, unless he's wR1+1Ng LiK3 +H1s.
 
One would think a grammar nazi would at least know how to spell 'grammar', or at the very least spell-check (which is built in) his rant. A copy editor, no less... fail.
 
I'm a copyeditor, and one thing I have noticed, is that when a person can't write, it's also a sign that this person has no credibility.

...

This guy needs to look up possessive 's' in a book of gammer. I was taught about possessive 's' literally in the second grade. His entire article is full of grammatical errors of this nature. It tells me a lot about him - chiefly, that I should ignore him.

:p

You've basically stated that you have no credibility. Bravo.
 
From a company financial standpoint, pretty much right on. Flagships are halo products.
QFT. That said, where are Nvidia's 5770, 5750, 5670, 5570, and 5450 killers? These are going to be AMD's real bread and butter, and there isn't even talk of anything from Nvidia to compete with them.
 
QFT. That said, where are Nvidia's 5770, 5750, 5670, 5570, and 5450 killers? These are going to be AMD's real bread and butter, and there isn't even talk of anything from Nvidia to compete with them.

Coming Q2 2010, according to a press release made today. In Q1 we'll see some supply, and in Q2 we'll see production hit a 'pull strike', aka 'full stride', with a full range of Fermi offerings including 4 different cards. I gather one of those 4 cards will be a 5770/5750/5670 competitor. It sounded from their press statement like we might not see lower-end cards till Q2 2010, which would be kind of... sad if that's the case, as there won't be a lot of competition. Honestly, the quote below made me feel like 'late' Q2 is more realistic. It didn't seem optimistic, with the CEO saying "mainstream ... graphics cards hardly demand new functionality ... not crucial ... to update".

"Nvidia’s chief executive officer did not provide any concrete timeframes for the transition of the whole lineup to the new Fermi architecture, but said that since the owners of mainstream and entry-level graphics cards hardly demand new functionality, it is not crucial for Nvidia to update currently available “fabulous” graphics chips. In addition, the speed of the transition depends on the supply of 40nm chips by TSMC."
 
.....

"Nvidia’s chief executive officer did not provide any concrete timeframes for the transition of the whole lineup to the new Fermi architecture, but said that since the owners of mainstream and entry-level graphics cards hardly demand new functionality, it is not crucial for Nvidia to update currently available “fabulous” graphics chips. In addition, the speed of the transition depends on the supply of 40nm chips by TSMC."

Looks like late Q2 or even Q3 for mainstream GPUs from NV :mad:

 
Looks like late Q2 or even Q3 for mainstream GPUs from NV :mad:


Actually, I highly doubt Nvidia is even thinking about 40nm entry level, and "mainstream" is likely just a slightly reduced Fermi at still-high prices.

With the way Fermi is set up, scaling down at 40nm will absolutely kill performance. No, it's going to be 28nm for mainstream and entry level. Just like the 8800GTX days, where the move to 65nm finally enabled 8-series mainstream and entry parts.
 
Yep. When asked about entry-level derivatives, they essentially responded that the stuff they have now is good enough. I would be very surprised to see entry-level Fermi on 40nm.
 
That is one of the silliest things I have EVER heard.

Do you have any idea how markets work? Are you enjoying paying $300 for an upper-mainstream card? How about $400, to make it more enjoyable, once ATI is the sole serious graphics card maker on the market?

Now, no need to worry. Even as an ATI fan, I enjoyed using their 55nm parts. And 55nm is getting cheaper and cheaper to make. Worst comes to worst, Nvidia can skip 40nm altogether and focus on flooding the market with more 55nm stuff.

Well, if he meant the Nvidia "The Way It's Meant to Be Played" approach of turning off features for AMD cards for no reason aside from being jerks, then I agree with him: Nvidia deserves to suffer.
 
Insanity: doing the same thing and expecting a different result.

Nvidia: see insanity.
 
I'm a copyeditor, and one thing I have noticed, is that when a person can't write, it's also a sign that this person has no credibility.

Here's what Charlie wrote: "This compares quite unfavorably to it's main competitor, ATI's Cypress HD5870 at 334mm^2. ATI gets over 160 chip candidates from a wafer, but Nvidia gets only 104."

This guy needs to look up possessive 's' in a book of gammer. I was taught about possessive 's' literally in the second grade. His entire article is full of grammatical errors of this nature. It tells me a lot about him - chiefly, that I should ignore him.

Sorry, but for the type of writer he is, I don't think this applies at all. His understanding of the tech he is reporting on, and his ability to convey it to the masses, is what really matters, and on that front he does well. His being biased is a different matter. No one listens to Kyle, Dan, Mark, Johnny, or OklahomaWolf for their grammar abilities.
 
I believe the post that says Nvidia announced today that MAY will be the release date.
Of course, they didn't say what year.
 
Actually, I highly doubt Nvidia is even thinking about 40nm entry level, and "mainstream" is likely just a slightly reduced Fermi at still-high prices.

With the way Fermi is set up, scaling down at 40nm will absolutely kill performance. No, it's going to be 28nm for mainstream and entry level. Just like the 8800GTX days, where the move to 65nm finally enabled 8-series mainstream and entry parts.

You can look at Fermi as 8 parallel sections: if you take a bad chip that has 1 bad section, you seal it off and turn a bad "480" into a good 470. Seal off another section and you've got something with 75% of the performance. Rinse and repeat all the way down to something that is 1/2 to 1/8th of the performance, based on actual yields. Doing so lets you reuse the "scrap" chips, and suddenly your 480 dies don't cost as much; or you say they do, and your derivative dies are "free".
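Here's a minimal sketch of that binning logic, treating the die as 8 independent sections and asking what fraction of dies lands in each bin. The per-section defect rate is a made-up assumption, not a real Fermi figure.

```python
# Sketch of the salvage-binning idea above: a die with 8 independent
# sections, where a die with k bad sections is fused down and sold as a
# lower bin instead of being scrapped. The per-section defect rate is a
# made-up assumption, not a real Fermi figure.
from math import comb

SECTIONS = 8
P_BAD = 0.25  # assumed chance that any one section is defective

def bin_share(bad):
    """Probability of exactly `bad` defective sections (binomial,
    assuming sections fail independently)."""
    return comb(SECTIONS, bad) * P_BAD**bad * (1 - P_BAD)**(SECTIONS - bad)

# 8/8 good is the "480"-style bin, 7/8 the "470"-style bin, and so on.
for bad in range(SECTIONS + 1):
    print(f"{SECTIONS - bad}/8 sections good: {bin_share(bad):6.1%} of dies")

# Anything with at least one working section is sellable at some bin.
sellable = 1 - bin_share(SECTIONS)
print(f"sellable at some bin: {sellable:6.1%}")
```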
 
Well, this thread has been interesting. I think the way to handle Charlie is the same as with any rumor site: take what he has to say, mull it over, then see what makes sense (like how we knew a very large chip was going to be tough to make and would run hot). Check whether you have other independent reports. Then check whether you see any pattern; if you do, there is probably some truth somewhere close by...
 