maverick0817 (Limp Gawd), joined May 3, 2009, 295 messages
I think, based on the fact that I have no plans for a 30", I'll probably end up getting a 5850.
A die is generally in the ballpark of $50; even if it costs Nvidia 20% more, that increase is only $10. I just don't see this having as much of an end-user impact as they (mainly Charlie) tout it as having.
The cost differences were already including yield differences. Manufacturing cost does not translate to vendor cost or retail price. Is $60 not more than $50? And that's just assuming we use your numbers. In addition, the size of the die is not the only cost involved. Larger dies come with lower yields; I wouldn't be surprised if that alone costs nVidia more than the die-size increase itself.
You also aren't counting the offset from the insane margins coming from Tesla and Quadro cards. I can make up lots of reasons for it to be less expensive based on cost as well, such as economies of scale and differences between Nvidia's contract and ATI's contract with TSMC. Then there is the more expensive 384-bit interface, in addition to simple marketing price increases if Fermi is faster. Whether you want to admit it or like it, that's all evidence pointing to a more expensive card.
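For what it's worth, the die-size/yield argument being made above can be sketched with a back-of-the-envelope model. Everything here is illustrative: the wafer cost, defect density, and die areas are my assumptions, not actual TSMC or vendor figures, and the Poisson yield formula is just a common textbook simplification.

```python
import math

WAFER_COST = 5000.0      # illustrative 300 mm wafer cost, not an actual quote
WAFER_AREA = 70685.0     # usable area of a 300 mm wafer in mm^2 (~pi * 150^2)
DEFECTS_PER_MM2 = 0.002  # illustrative defect density

def cost_per_good_die(die_area_mm2: float) -> float:
    """Rough cost per working die under a simple Poisson yield model."""
    dies_per_wafer = WAFER_AREA // die_area_mm2            # ignores edge losses
    yield_fraction = math.exp(-DEFECTS_PER_MM2 * die_area_mm2)
    return WAFER_COST / (dies_per_wafer * yield_fraction)

# A bigger die costs disproportionately more: fewer candidates per wafer
# AND a lower fraction of them work.
small = cost_per_good_die(334.0)   # roughly Cypress-sized (illustrative)
large = cost_per_good_die(530.0)   # roughly Fermi-sized (illustrative)
print(f"small: ${small:.2f}, large: ${large:.2f}, ratio: {large / small:.2f}")
```

The point of the sketch is that the cost ratio comes out larger than the raw area ratio, which is the yield effect both posters are arguing about.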
I don't recall saying that I did. You don't have to have official numbers from nVidia to make an educated guess.
Market economics? It's called a flagship. The 260 and 275 never fell out of line on price with the 4870. With all the "4870 is cheaper than dirt to make and the 260/275 are huge dies and cost a fortune" talk, the 4870 and the 260/275 always stayed in line on price with each other. Perhaps now you can explain why you think it won't be more expensive?
Charlie says no Fermi till May 2010. http://www.semiaccurate.com/2009/11/02/nvidia-finally-gets-fermi-a2-taped-out/

"During not-Nvision/GDC, Nvidia was telling people who mattered and AIBs not to expect Fermi until March. Internally it was saying May, but the AIBs were not told that. About the A2 tapeout time, Nvidia's AIB messaging was changed to April or May. Anyway, if all goes perfectly, we are looking at February for the start of real quantities. There will be A2 silicon before that, but nothing in real quantities. Anyone who says otherwise has ulterior motives or doesn't understand how the industry works."

I'm having trouble following his timeline. How does he get three months from February quantities till May with vendors?
Wait for Fermi to be released, then you can compare to see which best suits you on cost/performance.
People who paid $250 for a 5850 on launch got a heck of a deal, because they are all $299 and sold out now.
I wouldn't wait for Fermi. By the time the top-end GTX385/395 are out, AMD will be delivering the 5870 refresh in 32nm, or the Northern Islands HD6000 series. We're talking another 4-6 months now, and it's already November.
So that makes the next Nvidia card a Fergilicious MILF?
Maybe within these forum walls, but in reality everyone has a budget, I would hope... or they go bankrupt... or into debt like our government.
Maybe I'm speaking out of ignorance since I've never had much money. But it seems to me that if you can seriously consider purchasing one of those GTX300 thingimajiggers the day they hit the shelves, losing $50-$100 on a 5850 would be the least of your problems. I could be wrong though, that is a significant amount for most people.
Personally, if PC gaming software does not improve (the games themselves, storylines, etc.) and they don't quit porting everything, man, I see no reason to upgrade. As it is now, my 2x275s are just going to waste with the crap that's getting put out these days.
I am in no hurry to upgrade till things improve on the other end (and that includes DX11)...
Where the fuck are you, Nvidia... any day now.
Whoa, now that is bad ass.
If you are the type of person who can afford to go out and buy a $50,000 car instead of a $20,000 one, you are spending about $600 a month extra for that car. If you've got that level of income and instead choose to spend it on computer parts, that's $600 a month of computer parts, or $7,200 a year. Buying a $400 graphics card every 6 months isn't that big of a deal.
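The arithmetic in that comparison checks out under one assumption: a roughly 50-month financing term is what turns the $30,000 price difference into about $600 a month (the term length is my guess to make the poster's numbers line up, not something stated in the thread).

```python
# Numbers from the post: a $50,000 car vs a $20,000 one,
# with the difference spent on PC parts instead.
car_price_diff = 50_000 - 20_000   # $30,000
months = 50                        # assumed financing term (my assumption)

extra_per_month = car_price_diff / months
per_year = extra_per_month * 12

print(extra_per_month)  # 600.0
print(per_year)         # 7200.0
```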
Yes, plus cars are much faster on the PC than your $50,000 car!
And I'm almost certain that Nvidia will not be able to match ATI's power efficiency.
That is a nice feature and worth considering, but it's not the deal-breaker it is on mid-range cards. Most people looking to spend that kind of money are willing to pay the running costs as well.