http://www.brightsideofnews.com/news/2010/1/21/nvidia-gf100-fermi-silicon-cost-analysis.aspx
interesting if it pans out
What is important beyond the price of the die is the end price for the consumer. While nVidia will find tests in which it beats both the single-GPU HD 5870 and the dual-GPU HD 5970, going with the $599 price tag would be suicide in this economy. We expect to see a price of $549 or $499 for the high-end part, probably named "GeForce 380", and $349-399 for the "GeForce 360".
BSN said: In the case of nVidia, when 60% is the best you can expect and 40% is a realistic goal, you're ending up with the 24x24mm die costing $87, but realistically you're looking at $131 per chip. $131 isn't all that bad, given the current cost of a single AMD Cypress die is $96. For as long as you're under $100 per die, you can count your blessings with TSMC's 40nm process.
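The arithmetic behind those two figures is just total wafer cost spread over the dies that actually work. As a sanity check, here is a minimal sketch; the wafer price (~$5,460) and die-candidate count (~104 per 300mm wafer) are my own assumptions chosen to reproduce the article's numbers, not figures from the article:

```python
def cost_per_good_die(wafer_cost, die_candidates, yield_rate):
    """Cost of one working chip: wafer cost divided over good dies."""
    return wafer_cost / (die_candidates * yield_rate)

# Assumed inputs (NOT from the article): ~$5,460 per 40nm wafer,
# ~104 GF100-sized die candidates on a 300mm wafer.
WAFER_COST = 5460.0
CANDIDATES = 104

best_case = cost_per_good_die(WAFER_COST, CANDIDATES, 0.60)
realistic = cost_per_good_die(WAFER_COST, CANDIDATES, 0.40)
print(f"60% yield: ${best_case:.2f} per good die")   # ~$87.50
print(f"40% yield: ${realistic:.2f} per good die")   # ~$131.25
```

Note the ratio between the two cases is fixed at 0.60/0.40 = 1.5 regardless of the assumed wafer price: halving yield from 60% to 40% raises per-chip cost by 50%, which is exactly the $87 vs $131 gap the article quotes.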
Charlie said: Since GF100 silicon at 40 percent yields will cost about $125 for the raw unpackaged silicon versus a consensus number of less than $50 for ATI's Cypress
BSN said: Truth be told, given that GF100 kills both HD 5870 and HD 5970 in tessellation and consumes less power than HD 5970
BSN said: going with the $599 price tag would be suicide in this economy. We expect to see a price of $549 or $499
Yeah, there are some oddities in the last 2 paragraphs.
did i miss something?
and then you get this
so $599 is suicide, but $50 cheaper is all of a sudden not suicide?
if it's faster than the 5970 and consumes less power than the 5970, why would $599 be suicide but not $549?
that article bounces back and forth quite a bit; it's all over the place, and I'm not even sure what the whole point of it was after reading it.
it starts out as a cost analysis, but I'm not sure where it ended up; the last 2 paragraphs are just an oddity.
$600 would be stupid, as you could never launch a dual-GPU unit based on price alone.
you can't sell a card at $600 for the entire generation.
Now, ATI launched a shot at Nvidia last year by drastically cutting prices. It would be very interesting if Nvidia launched a shot this year. Could you imagine the chaos that would ensue if the GF100 launched at a loss/breakeven price of $350 or even $300?
Wishful thinking; no way in hell that is going to happen. Nvidia partners have been soaked dry for months; no way they are gonna wanna make a $5 profit on a card like this. They will be shooting for $50-75 per card at the very least, probably, and I will be surprised if the card is priced below $500.
And as far as ATI is concerned, watch the HD 5870 go down to $330 or $300 without even breaking a sweat. The yield issues are nearly eliminated and there is no shortage, so ATI will keep making a crapload of profit until Nvidia comes out, and still make decent profit while selling the cards at cut-down prices. The HD 5850 might be selling at $200-250 a few months from now.
Made up news is still made up.
There is nothing to see here...
Close: Fermi doesn't exist until it's finalized and in capable reviewers' hands. Right now it's just a tech demo of what could be...
Like the N64 before it was released...anyone else remember those Final Fantasy graphics that were out of this world?
http://anandtech.com/video/showdoc.aspx?i=3721&p=3
To put things in perspective, between NV30 (GeForce FX 5800) and GT200 (GeForce GTX 280), the geometry performance of NVIDIA's hardware only increased roughly 3x. Meanwhile the shader performance of their cards increased by over 150x. Compared just to GT200, GF100 has 8x the geometry performance of GT200, and NVIDIA tells us this is something they have measured in their labs.
Good.
I don't want to pay $650 for the 8800 GTX / 9800 GX2 / GTX 280 / Nvidia's current gen this time.
I'm not saying we don't know a lot. I'm saying what we do know can change. Thus all speculation is still just speculation at this point. That wasn't just renders. It was videos, at E3, it even had a playable demo at one point.
I wouldn't be surprised to see Fermi's specs change; in fact, I'm counting on it. Most of the information we have will be right, but until we know the finalized specs for sure, speculating on the price seems illogical to me. Then again, game demos aren't usually indications of games coming out...
Do you see what I'm saying?
...
I would be very surprised if Nvidia launched GF100 at more than $450, unless it warrants that price because of a linear price/performance relationship to the 5870 while the 5870 is priced at $325-ish.
It's not wishful thinking. It isn't even that close to a reality. The 5870 won't dip to $300; there just isn't a reason to. Yield or no yield, they aren't going to drop $100 off the price; it would undercut the rest of the high-margin, midrange cards if they did.
As for the Nvidia partners, people have still been buying the 285s just fine at their outrageous price tags. Don't believe me? Just look at prices that AREN'T falling. Retailers drop prices when they need to move things. They haven't been dropping the price on the 285. There is only one conclusion that can come from that.
Nvidia has all but stopped production of their GTX 200 series GPUs. They just are not competitive; their cards are inferior at every price point above $150. It seems they'd rather make a small profit on the cards that actually sell than have to subsidize each and every card they sell. Some people are going to buy green anyway, and those are really NV's only customers left.
Also, does anyone know the cost of a 5870 compared to a 4870? It seems that they might have a similar cost to make so I don't think AMD would have a problem lowering the price on it significantly if Fermi was indeed priced competitively.
That story doesn't paint a pretty picture for nvidia. If that is more positive than before, the news from "before" must have been pretty terrible.
it was, and while it's still not pretty, it's not looking like the bloodletting we were expecting. All the same, there are some points of contention in this article. For one, the quoted cost of the ATI chips is probably off; I think someone was counting two chips for the 5970.
It is a bloodletting. They are totally off on ATI's prices.
The breakdown by others in the know puts a complete Hemlock board at roughly the same price as the GTS 360 part that will launch in late March, which is around $280 for a complete board (RAM, cooling, PCB and whatnot).
The 5870 is less than $150 according to the same sources.
That means, price-wise, AMD/ATI can send Hemlock out to fight against the GTS 360.
But they won't, because they will simply put a refresh against the GTS 360 if it proves much faster than the 5870. Most likely the refresh will have 2 gigs of RAM also.
The article is wrong. The Cypress yields and size are wrong, and so are the costs. Cypress costs less than $50 per chip at this point for AMD.
The 5870 will cost between $150-$200 to create. The 5850 is closer to $150. There is a $180 profit baked into the MSRP of the 5870 currently.
Costs can also go down, btw. As yields go up, they will be able to reduce the voltage the chip needs, which reduces the cooling and power requirements, and they can reduce the price of the whole package greatly.
Hemlock currently costs between $250 and $300 to create.
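Taking the poster's cost estimates at face value, the implied margins are simple subtraction. A minimal sketch; the MSRPs used here (~$380 for the HD 5870, ~$600 for the HD 5970) are my assumptions based on street prices at the time, not figures from the post:

```python
def gross_margin(msrp, build_cost):
    """Absolute profit and percentage gross margin on one board."""
    profit = msrp - build_cost
    return profit, profit / msrp * 100.0

# Build costs are the post's upper estimates; MSRPs are assumed.
boards = [("HD 5870", 380, 200),
          ("HD 5970 (Hemlock)", 600, 300)]

for name, msrp, cost in boards:
    profit, pct = gross_margin(msrp, cost)
    print(f"{name}: ${profit} profit per board ({pct:.0f}% margin)")
```

With a $200 build cost against a $380 MSRP, the $180-per-board profit the post mentions falls straight out, which is consistent with the claim that ATI has plenty of headroom to cut prices once Fermi arrives.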
The new nvidia architecture (according to the white paper released this week) is going to be much stronger at pushing polygons than the previous architecture.
The next quote is from anandtech
http://anandtech.com/video/showdoc.aspx?i=3721&p=3
Of course, whether this increased focus on geometry is wasted resources (die space) or not will depend on the games over the next couple of years. Furthermore, we don't know any real-world results for the GF100; we don't even know the clocks of the part yet. Thus, speculation on the performance is like having a blind man be an art critic.
Last I checked, both cards run DX11. The PS3 and Xbox comparison is not relevant in the least.

AMD is sitting in an extremely favorable and commanding position at this point.
They have higher yielding, cheaper to manufacture chips which have been available in the market for months. Also developers have had these cards for nearly a year now. A huge installed base (2 million and counting) coupled with being the only DirectX 11 development platform only spells good things for those users who chose the HD5xxx series. This is very similar to how the Xbox360 decimated the PS3 as the development platform of choice because it was available first and it was built around an industry wide standard API.
Nvidia does not require a 120Hz monitor or 3D glasses to run its Eyefinity equivalent. It requires 120Hz monitors and 3D glasses to run it in 3D. Furthermore, telling someone who just bought 3x 24" monitors and plans to game at 3600x1920 or 5760x1200 or higher that SLI is a "barrier" is a bit... dumb.

Knowing this, the 5xxx series will not come down in price until the refresh parts are launched. What they will do, instead, is launch a new middle-of-the-road SKU (HD5830), and release drivers that improve performance across the board and add or enhance features (3D glasses support is one example). Nvidia's answer to Eyefinity reeks of being a hacked-up, last-minute attempt to fill a checkbox and say "Hey, we have it too!", but upon closer inspection there are so many barriers to entry to be able to experience it (expensive high-refresh monitors, expensive glasses, SLI, etc.) that in the end it will only benefit a smaller number of users than those who went or will go with Eyefinity.
Last time I checked, Nvidia had lots of these. For example, here is EVGA's Classified 285.

They will also open up board design to partners so they can come up with their own custom overclocked versions. The MSI HD5870 Lightning Edition is a sign of things to come.
Read the reviews: the G92, despite being touted as "old tech", still keeps up with the 5770 just fine. In fact, in several cases it was faster.

This way they will saturate all of the price tiers with choices for all types of users and will squeeze Nvidia out of the middle, forcing them to compete at the ultra high end, which is an incredibly small market compared to the bread-and-butter mainstream and performance segments.

I'm not going to say much about the low end because there is no point. Once the low-end versions of the HD5xxx family are out, Nvidia will not be able to compete in price, nor performance, nor features.
Yep, Nvidia is going out of business tomorrow.

Other than that, AMD does not need to do much else. They can simply sit and watch Nvidia run themselves ragged trying to squeeze as much performance as they can out of their low-yielding, hotter-running, more-expensive-to-manufacture chip to justify the price tag it will sell at, while they just keep adding to their 2-million installed user base every day.