BSN nVidia GF100 Fermi silicon cost analysis, more positive than before

Good.

I don't want to pay $650 like with the 8800 GTX / 9800GX2 / GTX 280 / Nvidia's current gen this time.
 
That's pretty cool. It supports my suspicion that GF100 will struggle with its price/performance ratio. If they're eating the loss now, then even when yields increase they might not drop prices as fast, to make up for their "losses".

I'm also starting to wonder what ATI is planning as a counter. I guess they're just waiting for more solid info on GF100 before they announce any refresh/price drop.

March can't come soon enough!
 
I'm sure Charlie D will find a way to put a negative spin on it.

Seems like Fermi is not "unmanufacturable" after all.
 
It ought to be noted, though, that TSMC has recently fixed its yield issues, which should decrease the die cost even further. A gift from above for Nvidia.

One thing to learn here though, folks: BIG dies are NOT the answer going forward. You make FAR, FAR more money using your $5,000 wafer runs to churn out boatloads of Tegra chips, which have a many times smaller die.

Globalfoundries is only making the huge 28nm dies in their test runs to scare Nvidia, or to make a 6890 launch product to smash Nvidia at the turn. However, this is unsustainable in the long run because it requires yields to be extremely high.

I fully expect both companies to focus in 2011 on smaller dies and cheaper cards, like a $200 Fermi and ATI 6670s, 6770s, and 6830s; you get many more dies per wafer.
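As a rough illustration of that last point, here is a minimal sketch using the common gross-dies-per-wafer approximation; the die areas are ballpark assumptions for illustration only, not official figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough gross die count using the standard edge-loss approximation."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Illustrative die areas only (assumptions, not official numbers):
for name, area in [("GF100-class (~530 mm^2)", 530),
                   ("Cypress-class (~330 mm^2)", 330),
                   ("Tegra-class (~50 mm^2)", 50)]:
    print(f"{name}: ~{dies_per_wafer(area)} gross dies per 300 mm wafer")
```

The small die gets an order of magnitude more candidates out of the same wafer, which is the whole argument.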
 
BTW, the article is BS. If it's to be believed, ATI is getting shafted like no other, getting charged more than double per wafer compared to Nvidia.
 
That story doesn't paint a pretty picture for nvidia. If that is more positive than before, the news from "before" must have been pretty terrible.
 
More important than the price of the die is the end price for the consumer. While nVidia will find tests in which it beats both the single-GPU HD 5870 and the dual-GPU HD 5970, going with the $599 price tag would be suicide in this economy. We expect to see a price of $549 or $499 for the high end part, probably named "GeForce 380", and $349-399 for the "GeForce 360".

mmm.. competition...
 
BSN said:
In the case of nVidia, when 60% is the best you can expect and 40% is a realistic goal, you're ending up with the 24x24mm die costing $87, but realistically you're looking at $131 per chip. $131 isn't all that bad, given the current cost of a single AMD Cypress die is $96. For as long as you're under $100 per die, you can count your blessings with TSMC's 40nm process.
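Taken at face value, those per-chip numbers fall out of a simple wafer-cost-over-good-dies calculation. A minimal sketch, assuming a roughly $5,000 40nm wafer and ~95 candidate dies per wafer (both are assumptions picked to approximately reproduce the article's figures, not numbers quoted from it):

```python
def cost_per_good_die(wafer_cost, gross_dies, yield_fraction):
    """Spread the wafer cost over only the dies that actually work."""
    return wafer_cost / (gross_dies * yield_fraction)

# Assumed inputs: ~$5,000 per wafer, ~95 candidate GF100-sized dies per wafer.
for y in (0.60, 0.40):
    print(f"yield {y:.0%}: ~${cost_per_good_die(5000, 95, y):.0f} per good die")
```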

According to semi-accurate:
Charlie said:
Since GF100 silicon at 40 percent yields will cost about $125 for the raw unpackaged silicon versus a consensus number of less than $50 for ATI's Cypress.
If by launch the 5870 is a $300 card (I'm assuming a price drop here), adding the $35 cost difference to get a $335 price tag, you would need GF100 to come out ~12% faster to match price/performance. By Charlie's numbers, you'd need it to be priced at $375 against $300 and be 25% faster to match price/performance. With the truth lying somewhere in the middle, and conservative estimates on speed saying it should easily be 20% faster than a 5870, I see no reason Nvidia CAN'T match the price/performance of the 5870. It'll be interesting to see their launch pricing strategies. Expect heavy price cuts from ATI.

Now, ATI launched a shot at Nvidia last year by drastically cutting prices. It would be very interesting if Nvidia launched a shot this year. Could you imagine the chaos that would ensue if the GF100 launched at a loss/break-even price of $350 or even $300?
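To make the price/performance arithmetic above concrete, here is a minimal sketch; the $300 HD 5870 price and the $35/$75 cost gaps are the post's own assumptions, not confirmed figures:

```python
def required_speedup(gf100_price, hd5870_price=300):
    """How much faster GF100 must be to equal the 5870's performance per dollar."""
    return gf100_price / hd5870_price - 1

# Assumed: 5870 at $300; GF100 priced at the 5870 price plus the die-cost gap.
for label, cost_gap in [("BSN cost gap", 35), ("SemiAccurate cost gap", 75)]:
    price = 300 + cost_gap
    print(f"{label}: a ${price} GF100 needs to be "
          f"~{required_speedup(price):.0%} faster to match the 5870")
```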
 
BSN said:
Truth to be told, given that GF100 kills both HD 5870 and HD 5970 in Tessellation and consumes less power than HD 5970

????

did i miss something?

and then you get this

BSN said:
going with the $599 price tag would be suicide in this economy. We expect to see a price of $549 or $499

so $599 is suicide but $50 cheaper is all of a sudden not suicide?

if it's faster than the 5970 and consumes less power than the 5970, why would $599 be suicide but not $549?


that article bounces back and forth quite a bit; it's all over the place. I'm not even sure what the whole point of it was after reading it.

it starts out as a cost analysis but I'm not sure what it ended with; the last 2 paragraphs are just an oddity.
 
Yeah there are some oddities in the last 2 paragraphs.

$600 would be stupid, as you could never launch a dual-GPU unit based on price alone. If it is going to cost within $50 of the 5870 to make, then you price it within $50 of the 5870 and scuttle ATI's market. People are going to have a serious shock factor when it comes to a $600 price tag.
 
$600 would be stupid, as you could never launch a dual-GPU unit based on price alone.

?? Didn't stop the GTX 280. Of course they will drop the price before releasing a dual-GPU card; you can't sell a card at $600 for the entire generation.
 
There have been $600 single GPUs before; that is nothing new.

As for some kind of magical $400 or $300 range card: this is not like baking a bunch of pies and selling a bunch of them cheap to cause chaos. They only get so many wafers per month. TSMC has been going at max capacity for years now. And it takes time to make a wafer.

If you get, say, 2,000 cores you can sell for the month, all 2,000 will sell... What price will you set, $400 or $600, especially as your bread-and-butter lineup has been decimated?

It's going to be $550 and up. The yield fixes mean they will be able to sell more cards, but there is no way Fermi is going to price drop, or go low, or be anything other than super expensive for at least 4 months.
 
It's NVIDIA; I won't believe it will be any cheaper than $599 if it is their flagship card until I see it,

but NVIDIA is also taking a beating from the 5 series and a bit from the 4 series so maybe they will price it smart!
 
Now, ATI launched a shot at Nvidia last year by drastically cutting prices. It would be very interesting if Nvidia launched a shot this year. Could you imagine the chaos that would ensue if the GF100 launched at a loss/break-even price of $350 or even $300?

Wishful thinking; no way in hell is that going to happen. Nvidia partners have been soaked dry for months; no way they are gonna wanna make a $5 profit on a card like this. They will be shooting for $50-75 per card, and that is the very least, probably, and I will be surprised if the card is priced below $500.

And as far as ATI is concerned, watch the HD 5870 go down to $330 or $300 without even breaking a sweat. The yield issues are nearly eliminated and there is no shortage, and ATI will keep making a crapload of profit until Nvidia comes out, and still make decent profit while selling the cards at cut-down prices. The HD 5850 might be selling at $200-250 a few months from now.
 
Let's be honest: if volumes are low and performance is good, people will pony up $600 to have the latest green monster. AMD is hoping for as long a delay as possible so they can move units at the current MSRP. It's all smoke and mirrors from Nvidia at the moment; no one's going to move on a GF100 without decent measurable results, except the fanboys whose blood is green (which is fine where volumes are low).
 

I would watch the 5830 and 5670; they can REALLY eat Nvidia's lunch by lowering the price on those cards.

And Nvidia fans would gladly pay even $700 for one of the first batch of cards out. Thankfully for Nvidia, though, they will be able to launch with more than 2,000 cards and will be able to price them lower. However, I fully expect a $599 price tag. They are going to be marketing it as a CUDA and Folding@home monster, and it will sell.

But yes, ATI is ready with huge price cuts up their sleeves. The 5830 just looks silly at its current price and I expect that to drop like a rock, perhaps as far down as $110 or so.
 
Made up news is still made up.

There is nothing to see here...

Nothing to see if you can't make your own analysis of what the article states combined with the various near-always accurate posters on a few different forums... ;). Mush all the info together, weed out the bad, and you have a pretty good picture well in advance of the "official announcements". I guess in your world, Fermi didn't exist until a day or so ago when nVidia released the "deep dive" slides from NDA. This article in particular doesn't contain anything new though and is fairly poorly edited.
 
Close: Fermi doesn't exist until it's finalized and in capable reviewers' hands. Right now it's just a tech demo of what could be...

Like the N64 before it was released...anyone else remember those Final Fantasy graphics that were out of this world?

[Image: finalfantasy641.jpg]
 

... I can't hold it in my hands, therefore it cannot be? Really? Really....? No, we don't have the finalized part's performance #'s from independent reviewers yet, but we know a huge amount about its architecture, specs, rough targeted performance range (5870-5970+), scalability downward due to how it is laid out internally, RAM capacities that will be available from nVidia whitepapers, auxiliary features such as triple-display gaming (in SLI) and optional "3d" gaming, etc. etc. We know it is targeted to use a high level of power as well, and that it is currently being aimed for around 2-2.5 months from now, from a variety of reliable "leakers" who have virtually always been spot-on in the past. So, it's a far cry from a CG rendering still in a magazine ;). Really all we lack is EXACT power usage and EXACT clock rates (which will determine the EXACT performance that we'll see from independent reviewers and ultimately in our own systems), but even those can be speculated upon with reasonable accuracy by enthusiast-level users at various forums. I'm sure plenty will go "BUT IT'S NOT OFFICIAL!!!!!", but this is how it always goes.
 

The new nvidia architecture (according to the white paper released this week) is going to be much stronger at pushing polygons than the previous architecture.

The next quote is from anandtech
To put things in perspective, between NV30 (GeForce FX 5800) and GT200 (GeForce GTX 280), the geometry performance of NVIDIA’s hardware only increases roughly 3x in performance. Meanwhile the shader performance of their cards increased by over 150x. Compared just to GT200, GF100 has 8x the geometry performance of GT200, and NVIDIA tells us this is something they have measured in their labs.
http://anandtech.com/video/showdoc.aspx?i=3721&p=3

Of course, whether this increased focus on geometry will be wasted resources (die space) or not will depend on the games over the next couple of years. Furthermore, we don't have any real-world results for the GF100; we don't even know the clocks of the part yet. Thus speculating on the performance is like having a blind man be an art critic.
 
I'm not saying we don't know a lot. I'm saying what we do know can change, so all speculation is still just speculation at this point. That wasn't just renders: there were videos at E3, and it even had a playable demo at one point.

I wouldn't be surprised to see Fermi's specs change; in fact, I'm counting on it. Most of the information we have will be right, but until we know the finalized specs for sure, speculating on the price seems illogical to me. Then again, game demos aren't usually indications of the games that come out...

Do you see what I'm saying?
 
Less than 5 days before the 5700 series came out, all the rumor sites and some of the review sites were convinced that the 5700 series parts were going to have 1120 shaders.

We barely know anything right now.
 
Wishful thinking; no way in hell is that going to happen. Nvidia partners have been soaked dry for months; no way they are gonna wanna make a $5 profit on a card like this. They will be shooting for $50-75 per card, and that is the very least, probably, and I will be surprised if the card is priced below $500.

And as far as ATI is concerned, watch the HD 5870 go down to $330 or $300 without even breaking a sweat. The yield issues are nearly eliminated and there is no shortage, and ATI will keep making a crapload of profit until Nvidia comes out, and still make decent profit while selling the cards at cut-down prices. The HD 5850 might be selling at $200-250 a few months from now.

It's not wishful thinking; it isn't even close to reality. The 5870 won't dip to $300. There just isn't a reason to. Yield or no yield, they aren't going to drop $100 off the price; it would undercut the rest of the high-margin, mid-range cards if they did.

As for the Nvidia partners, people have still been buying the 285s just fine at their outrageous price tags. Don't believe me? Just look at prices that AREN'T falling. Retailers drop prices when they need to move things. They haven't been dropping the price on the 285. There is only one conclusion that can come from that.

I would be very surprised if Nvidia launched GF100 at more than $450, unless it warrants that price through a linear price/performance relationship to the 5870 while the 5870 is priced at $325-ish.
 
I'm not saying we don't know a lot. I'm saying what we do know can change, so all speculation is still just speculation at this point. That wasn't just renders: there were videos at E3, and it even had a playable demo at one point.

I wouldn't be surprised to see Fermi's specs change; in fact, I'm counting on it. Most of the information we have will be right, but until we know the finalized specs for sure, speculating on the price seems illogical to me. Then again, game demos aren't usually indications of the games that come out...

Do you see what I'm saying?

Absolutely, and I do agree on that :).
 
...

I would be very surprised if Nvidia launched GF100 at more than $450, unless it warrants that price through a linear price/performance relationship to the 5870 while the 5870 is priced at $325-ish.

The price/performance relationship is not linear at the high end. Nevertheless, I agree that Fermi's price must be within a reasonable range in order to be competitive. I hope that there will not be any collusion between NVIDIA and ATI to keep prices artificially high...
 
It's not wishful thinking; it isn't even close to reality. The 5870 won't dip to $300. There just isn't a reason to. Yield or no yield, they aren't going to drop $100 off the price; it would undercut the rest of the high-margin, mid-range cards if they did.

As for the Nvidia partners, people have still been buying the 285s just fine at their outrageous price tags. Don't believe me? Just look at prices that AREN'T falling. Retailers drop prices when they need to move things. They haven't been dropping the price on the 285. There is only one conclusion that can come from that.

I would be very surprised if Nvidia launched GF100 at more than $450, unless it warrants that price through a linear price/performance relationship to the 5870 while the 5870 is priced at $325-ish.

Nvidia has all but stopped production of their GTX 200 series GPUs. They just are not competitive; their cards are inferior at every price point above $150. It seems they'd rather make a small profit on the cards that actually sell than have to subsidize each and every card they sell. Some people are going to buy green anyway, and those are really NV's only customers left.

Also, does anyone know the cost of a 5870 compared to a 4870? It seems that they might have a similar cost to make so I don't think AMD would have a problem lowering the price on it significantly if Fermi was indeed priced competitively.
 

The 5870 will cost between $150 and $200 to create. The 5850 is closer to $150. There is a $180 profit baked into the MSRP of the 5870 currently.

Costs can also go down, BTW. As yields go up, they will be able to reduce the voltage the chip needs, which reduces the cooling and power requirements, and they can reduce the price of the whole package greatly.

The Hemlock currently costs between $250 and $300 to create.
 
That story doesn't paint a pretty picture for nvidia. If that is more positive than before, the news from "before" must have been pretty terrible.

It was, and while still not pretty, it's not looking like the bloodletting we were expecting. All the same, there are some points of contention in this article; for one, the quoted cost of the ATI chips is probably off. I think someone was counting two chips for the 5970.
 
The only thing that strikes me about Fermi is that it's a card designed for 28nm sitting on 40nm. The article is BS, but you do get the basic point about how much loss you get from having such huge dies.

40nm, as big as it is, is just a sideshow in my opinion. I think the real battle is going to be 28nm Fermi parts against ATI's 6xxx series; it's going to be the battle of the early decade.
 
It was, and while still not pretty, it's not looking like the bloodletting we were expecting. All the same, there are some points of contention in this article; for one, the quoted cost of the ATI chips is probably off. I think someone was counting two chips for the 5970.

It is a bloodletting. They are totally off on ATI's prices.

The breakdown by others who know puts a complete Hemlock board at roughly the same price as the GTS 360 part that will launch in late March, which is around $280 for a complete board (RAM, cooling, PCB, and whatnot).

The 5870 is less, at around $150, according to the same sources.

That means, price-wise, AMD/ATI can send the Hemlock out to fight against the GTS 360.

But they won't, because they will simply put a refresh against the GTS 360 if it proves much faster than the 5870. Most likely the refresh will have 2 GB of RAM as well.
 

The board breakdown must be complete rubbish - it's very similar to a GTX 2xx series board, only with a narrower memory bus (making it cheaper) but GDDR5 (more expensive). Given that GTX 260s were selling for $150 and the card manufacturers will have been making a profit, those boards can have cost no more than $75 tops.

But now all of a sudden a very similar board costs $280... realistically the GTX 360 board is highly unlikely to cost more than $100, probably less.
 
The article is wrong. Cypress yields and size are wrong. So are the costs. Cypress costs less than $50 per chip at this point for AMD.


Your prices are wrong because I say they are. My tabloid source that I'm not going to bother linking is better than your tabloid source! Seriously? Give us a break.
 
The new nvidia architecture (according to the white paper released this week) is going to be much stronger at pushing polygons than the previous architecture.

The next quote is from anandtech

http://anandtech.com/video/showdoc.aspx?i=3721&p=3

Of course, whether this increased focus on geometry will be wasted resources (die space) or not will depend on the games over the next couple of years. Furthermore, we don't have any real-world results for the GF100; we don't even know the clocks of the part yet. Thus speculating on the performance is like having a blind man be an art critic.

I was under the impression that unified shaders in DX10 fixed this problem. If a shader unit on a modern video card can handle a pixel shader program or a vertex shader program, how exactly can you judge performance with the wide split like Anand did?

I mean, if you have the same number of processors available for pixel and vertex programs, why can't you use them however you well please? Unless there is a significant overhead processing vertex programs versus pixel (or less performance per clock), I would think the potential performance increases would be IDENTICAL. I find the Anandtech article hard to believe because of this.

EDIT: or is Nvidia specifically referring to Geometry Shader power, which comes into effect after the vertex shader stage? Geometry shaders have more to do with effects like tessellation, and thus the marketing literature SOUNDS good, because the reality is this: Nvidia has never spent much in hardware toward geometry shaders. So, now they're playing catch-up, and to make it sound better, they've spun it as good PR (we've made a huge investment in the future!)
 
AMD is sitting in an extremely favorable and commanding position at this point.

They have higher yielding, cheaper to manufacture chips which have been available in the market for months. Also developers have had these cards for nearly a year now. A huge installed base (2 million and counting) coupled with being the only DirectX 11 development platform only spells good things for those users who chose the HD5xxx series. This is very similar to how the Xbox360 decimated the PS3 as the development platform of choice because it was available first and it was built around an industry wide standard API.

Knowing this. The 5xxx series will not come down in price until the refresh parts are launched. What they will do, instead, is launch a new middle of the road SKU (HD5830), and release drivers that improve performance across the board and add or enhance features (eyefinity bezel management and 3d glasses support are examples). Nvidia's answer to Eyefinity reeks of being a hacked up last minute attempt to fill a checkbox and say "Hey, we have it too!" but upon closer inspection there are so many barriers of entry to be able to experience it (Expensive high refresh monitors, expensive glasses, SLI, etc etc) that in the end it will only benefit a smaller number of users than those who went or will go with Eyefinity.

They will also open up board design to partners so they can come up with their own custom overclocked versions. The MSI HD5870 Lightning Edition is a sign of things to come.
This way they will saturate all of the price tiers with choices for all types of users and will squeeze Nvidia out of the middle and force them to compete at the ultra high end which is a market incredibly small compared to the bread and butter bringing mainstream and performance market. I'm not going to say much about the low end because there is no point. Once the low end versions of the HD5xxx family are out, Nvidia will not be able to compete in price, nor performance, nor features.

Other than that, AMD does not need to do much else. They can simply sit and watch Nvidia run themselves ragged trying to squeeze as much performance as they can out of their low yielding, hotter running, more expensive to manufacture chip to justify the price tag it will sell at while they just keep adding to their 2 million installed user base every day.
 
AMD is sitting in an extremely favorable and commanding position at this point.

They have higher yielding, cheaper to manufacture chips which have been available in the market for months. Also developers have had these cards for nearly a year now. A huge installed base (2 million and counting) coupled with being the only DirectX 11 development platform only spells good things for those users who chose the HD5xxx series. This is very similar to how the Xbox360 decimated the PS3 as the development platform of choice because it was available first and it was built around an industry wide standard API.
Last I checked, both cards run DX11. The PS3 and Xbox comparison is not relevant in the least.

Knowing this. The 5xxx series will not come down in price until the refresh parts are launched. What they will do, instead, is launch a new middle of the road SKU (HD5830), and release drivers that improve performance across the board and add or enhance features (3d glasses support is one example). Nvidia's answer to Eyefinity reeks of being a hacked up last minute attempt to fill a checkbox and say "Hey, we have it too!" but upon closer inspection there are so many barriers of entry to be able to experience it (Expensive high refresh monitors, expensive glasses, SLI, etc etc) that in the end it will only benefit a smaller number of users than those who went or will go with Eyefinity.
Nvidia does not require 120Hz monitors or 3D glasses to run its Eyefinity equivalent. It requires 120Hz monitors and 3D glasses to run 3D Eyefinity. Furthermore, telling someone who just bought 3x 24" monitors and plans to game at 3600x1920 or 5760x1200 or higher that SLI is a "barrier" is a bit... dumb.

If AMD adds 3D support, they will still need 120Hz monitors.

I fully expect ATI to drop prices at the Nvidia launch date.

They will also open up board design to partners so they can come up with their own custom overclocked versions. The MSI HD5870 Lightning Edition is a sign of things to come.
Last time I checked, Nvidia had lots of these. For example, EVGA's Classified 285.


This way they will saturate all of the price tiers with choices for all types of users and will squeeze Nvidia out of the middle and force them to compete at the ultra high end which is a market incredibly small compared to the bread and butter bringing mainstream and performance market. I'm not going to say much about the low end because there is no point. Once the low end versions of the HD5xxx family are out, Nvidia will not be able to compete in price, nor performance, nor features.
Read the reviews: the G92, despite being touted as "old tech", still keeps up with the 5770 just fine. In fact, in several cases it was faster.

Other than that, AMD does not need to do much else. They can simply sit and watch Nvidia run themselves ragged trying to squeeze as much performance as they can out of their low yielding, hotter running, more expensive to manufacture chip to justify the price tag it will sell at while they just keep adding to their 2 million installed user base every day.
Yep, Nvidia is going out of business tomorrow. :rolleyes:
 