Fermi Working Samples for CES?

The margins are nice indeed.
Do Nvidia care about revenue or margins? They should decide before making more cards trying to be everything to everyone.
Forget revenue, look at % profit per unit sold and look at the further potential in the rapidly growing GPGPU market.
Where do you think future growth will come from? If you just mean GPGPU will supplant current GPUs...well that doesn't really change anything, does it?
Most of the profit at this point for nVidia and ATi is in mainstream and budget GPUs whereas high-end units are loss-leaders.
....no they aren't. They don't make a loss and aren't intended to make people buy other products. What an odd assertion.
The GPGPU market could potentially reverse, or at least help equalize, this situation.
I could win millions if I played the lottery.
Future super computers could be constructed out of thousands of cGPUs like Fermi instead of racks of quad-core computers. This has been done once already and so far it's a huge success. Making GPUs more general-purpose only makes this a more attractive option in the future.
Super computers are a niche business.

technogiant said:
Has the 5870 launched?
All this talk about the Fermi delay... if you can't actually buy a 5870, what's the difference?
Ask the people running them?
 
Do Nvidia care about revenue or margins? They should decide before making more cards trying to be everything to everyone.
Last time I checked they were still in business to make money.

Where do you think future growth will come from? If you just mean GPGPU will supplant current GPUs...well that doesn't really change anything, does it?
Last time I checked the GPU market was fairly well saturated. The super computing market is not.

....no they aren't. They don't make a loss and aren't intended to make people buy other products. What an odd assertion.
Actually, high end GPUs are commonly "halo" products. Slightly profitable if that, carrying the development costs for most of the mainstream products that sell orders of magnitude more.

I could win millions if I played the lottery.
Business investments and diversification are equivalent to the lottery? Wow.


Super computers are a niche business.
A single supercomputer can cost over $50M. Nvidia only needs to sell 10 per quarter to turn this "niche" into a third of its total quarterly revenue.


Ask the people running them?
/sigh That's nice, I'd like to go buy one. Last time I checked, I couldn't.

ATI can't take market share if it can't sell its product. You can't "win the war" without selling a product, no matter how good or how cheap your product is.
 
Has supply of the 5870 gone down even more since release? With a little effort I was able to get 2 cards on launch week. Has it really gotten so much worse people are unable to find any cards at all?
 
Has supply of the 5870 gone down even more since release? With a little effort I was able to get 2 cards on launch week. Has it really gotten so much worse people are unable to find any cards at all?

Pretty much.

http://search.zipzoomfly.com/search.aspx?Key=5870
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=2010380048+1305520549+106793261+1067949754&QksAutoSuggestion=&ShowDeactivatedMark=False&Configurator=&Subcategory=48&description=&Ntk=&CFG=&SpeTabStoreType=&srchInDesc=
http://www.mwave.com/mwave/deepsearch_v3.asp?scriteria=5870&ALL=y&TP=

TSMC screwed up a few batches of chips. They make the GPU core. Thus we have this burp in the supply line with no one having parts.
 
Yep, I was honestly considering buying ATI 5000 series, but now, even if I have the cash in hand, they can't sell me anything.

IMO, this plays hugely into NVIDIA's hands. Now, I'm going to sit back and wait for Nvidia to show us the goods.
 
Yep, I was honestly considering buying ATI 5000 series, but now, even if I have the cash in hand, they can't sell me anything.

IMO, this plays hugely into NVIDIA's hands. Now, I'm going to sit back and wait for Nvidia to show us the goods.

Yeah, it's one of the most frustrating feelings: wanting to give someone your money when they're incapable of giving you the product. It sucks for AMD to be in this position because of TSMC.
 
Yep, I was honestly considering buying ATI 5000 series, but now, even if I have the cash in hand, they can't sell me anything.

It's taken two weeks to the day since I bought my XFX 5870 to get it. (It's yet to arrive, but due today. "Out for delivery" in fact. :))

I'm too impatient to wait to see what Nvidia has in the works. I have the money now, and want a new GPU so the 5870 wins out this round. (Usually I buy Nvidia. Based on quality at the time of purchase, not on name.)
 
Ati is better...no Nvidia is better... My mom can beat up your mom.

Please people, if Fermi doesn't do well, that won't tank Nvidia. AMD/ATI is in a far worse position than Nvidia, and has been for some time. They are still in business, so one bad run for Nvidia (which btw, has happened before, do we all remember the Radeon 9800 Pro days?) will not break them.

Also, all of this nonsense saying GPGPU is a dead technology and isn't profitable is just that, nonsense. GPGPU is a growing industry right now. GPUs are being used more and more in environments all over the place to take advantage of the cheaper cost of running multiple GPUs in parallel to do computing. The entire industry is progressing further into parallel processing, so to say GPGPU is not profitable and not a worthwhile endeavor is just ludicrous. In addition to that, many of the advancements for CUDA also help Nvidia produce better cards for doing physics processing, whether it is through CUDA, PhysX, OpenCL, tessellation, etc.

And the big kicker here is still the fact that DX 11 is not even being used that much yet. Heck even DX10 was used only sparingly. And DX10.1 was used hardly at all. So what is the benefit of going out to buy a brand new card that is DX11 compatible that doesn't have any games you can play on it? If you can go out and buy an HD4890 or a GTX 275 and play any game out there (and especially now with the 275 PhysX combo card), what is the big deal?

You know why there is a shortage of HD5870s? Because what is the point of mass producing over your market share for a technology that isn't even being used yet? It is just good business sense not to try and stretch yourself too much. Remember earlier AMD/ATI had greatly overproduced stock and it hurt their margins severely. It is smart for them not to make the same mistake twice.

The whole thing about the Nvidia CEO sabotaging TSMC is also just ridiculous even if it is a joke. Why would he sabotage the fabs they are relying on to produce their product? That makes no sense at all. All of this is just typical fear mongering and hate speech. Did Nvidia drop the ball somewhere with Fermi? Sure, something went wrong somewhere in the process, most likely the fact that they were only getting about 2% yields initially from the TSMC fabs which was reported eons ago. Of that 2% they might not have been happy with some of the performance still, so they might have gone back to the drawing board. Perhaps the chips weren't integrating well with the other newer technologies they are producing to do more hardware calculating off of the main GPU chip (another step ahead with major growth implications due to DX11 standards). There are any number of things that go awry with new complex technology. Wisely it seems they are trying to do their best to perfect the card rather than put something out that is half arsed (like the 8000m series of chips).

I think it is hilarious how many rumors go on and that anyone ever listens to Charlie who always has the biggest load of BS out there. Whether he has background knowledge or technical knowledge means nothing when he consistently spits out drivel. Seriously all of his so called inside knowledge if it was accurate would have had Nvidia failing years ago and ATI the sole monopoly on graphics cards. But as it is, that hasn't happened.

As it happens to be right now, I think the Nvidia product delay is good for all of us from a competition standpoint. It gives ATI a little time to get back in and get some more market share in the Graphics market. But with the advances that Nvidia is going for, it could be that they might win back some of the console market and proceed more into the specialized computing market. So everyone should hopefully win here.
 
Last time I checked they were still in business to make money.
Yes of course, but Ferrari and Toyota both make money. They do it in very different ways. Nothing wrong with either approach, but Nvidia don't seem to really know which one they want to take.
Last time I checked the GPU market was fairly well saturated. The super computing market is not.
Even if it isn't, it's not a big market.
Actually, high end GPUs are commonly "halo" products. Slightly profitable if that, carrying the development costs for most of the mainstream products that sell orders of magnitude more.
Carrying them? That would be the complete opposite of a loss-leader...
Business investments and diversification are equivalent to the lottery? Wow.
Course not. I was mocking the lack of justification for his/her statement that GPGPU could provide significant revenue. "Nvidia could make significant revenue from GPGPU", "I could win the lottery", "Transformers 3 could be a good movie". See?
A single supercomputer can cost over $50M. Nvidia only needs to sell 10 per quarter to turn this "niche" into a third of its total quarterly revenue.
Well, I assume part of Nvidia's pitch is that their solution won't cost $50M, but regardless. There are 10 new supercomputers every quarter? I'm skeptical. Got a cite?
/sigh That's nice, I'd like to go buy one. Last time I checked, I couldn't.

ATI can't take market share if it can't sell its product. You can't "win the war" without selling a product, no matter how good or how cheap your product is.
I was just refuting the nonsense that it wasn't a proper launch. We all know it's TSMC who've screwed the pooch here, not AMD. Obviously it is not good for AMD either.
 
Please people, if Fermi doesn't do well, that won't tank Nvidia. AMD/ATI is in a far worse position than Nvidia, and has been for some time. They are still in business, so one bad run for Nvidia (which btw, has happened before, do we all remember the Radeon 9800 Pro days?) will not break them.
AMD are still going bankrupt Any Day Now?

Short term, Nvidia are fine even if Fermi is the mess it's looking like being; long term, they have serious issues. AMD are OK, given that Intel don't seem interested in killing them.
You know why there is a shortage of HD5870s? Because what is the point of mass producing over your market share for a technology that isn't even being used yet? It is just good business sense not to try and stretch yourself too much. Remember earlier AMD/ATI had greatly overproduced stock and it hurt their margins severely. It is smart for them not to make the same mistake twice.
Demand is high. To not supply it does not make sense. Especially when a competitor is releasing a product at some point. I'll stick to "TSMC blew it".

You shouldn't assume anyone who says anything bad about Nvidia (or vice versa) is as loony as Charlie, BTW.
 
While I agree everybody should stop slinging FUD at each other about one thing or another, AMD's CPU/GPU division actually turned a $2 million profit this quarter. But if a company is still in the red, it needs to continue being aggressive and making the best product possible.
 
I don't disagree with the development of Fermi, but Nvidia should really look at the option of simply starting from the ground up and building a strong, more focused GPU. These two chips should be developed in parallel, i.e. the jack-of-all-trades "Fergie" and a really fast graphics card...

Nvidia's largest market is probably neither the enthusiast, video-card-wielding gamer nor the supercomputer crowd. Remember that Nvidia also has a lot of embedded video in laptops and SOHO desktops, OEM boxes, etc...

It could forget about CUDA altogether and simply develop a DX11-compliant part to beat AMD at its own game. Though this is highly unlikely, as CUDA is Jen-Hsun Huang's "Pride-Child".
 
I think it's funny how people can't get it through their heads that GPGPU and gaming aren't exclusive. It's not like a card aimed at GPGPU purposes will suck at gaming or the other way around. Yes, it may be less optimized for one task, but it will still give acceptable performance. How hard is it to accept that nVidia may have optimized Fermi for both GPGPU and gaming tasks? When did we all become experts?
 
Beat AMD at what game? I'm not trying to defend Nvidia here, as they don't exactly have any hardware to properly compete, but we're talking about a paper titan when it comes to AMD hardware. The 5000 series is fantastic hardware. I'd kill to own a 5850 in my machine, but where can you buy one? Not many places to be honest.

Secondly guys, stop acting like the graphic heavyweights are what make AMD and Nvidia the bulk of their cash, because they don't. They do two things: Generate excellent PR, and give gamers like you and me long lasting hard-ons. The majority of the money is made at the bottom end sector, where Intel dominates.

Third, let's talk about Fermi. Nvidia is trying to expand their market into Fermi, because they know that the future of computing is going to be CPU/GPU integration. Let's not forget that Nvidia is competing against both Intel and AMD here, both of which are also CPU makers. Eventually, and that might be 5-10 years from now, a single die will handle both of our CPU and GPU processing. How do you think Nvidia is going to compete in this market if it only has half the playing cards?

Simple. It has to create a market.

But what comes first, the chicken or the egg? In order to create a business that's to be taken seriously, Nvidia has to first create hardware that investors and potential customers see as serious business. They've spent a lot of time creating a software language, a debugging tool, and now they need a real piece of hardware to show off all of this potential. Yes, this is a huge risk for Nvidia here, but at least it's a smart one.

I also see Fermi as a very long-term strategic move to compete on the global market. Intel is busy formulating Larrabee, and what do you think a highly parallel x86 graphics core is going to be able to do? Well shit, a lot! It may be pricey, it may be heavy, and it may not be the fastest at gaming, but damn it sure spits out our weather prediction models pretty fast. The point I'm trying to get at here is that if Nvidia doesn't go forward with expanding their market with Fermi, Intel will by default. Why? Because once some bloke figures out how to load and run Windows 7 on a Larrabee card, it's going to open the eyes of the entire market and make everybody say, "Hey wait a minute..."

And before you guys lay into me, keep in mind I'm not trying to defend AMD or Nvidia. I'm just trying to look at things long term. Nvidia has been milking the G80 architecture since its release, and that was like, what, almost 3 years ago or something? (November 8th, 2006). And now they can't get Fermi out on time and everyone thinks they're in deep shit. They might suffer a few quarters, but it's hardly going to knock Nvidia out of the game. Even if Fermi falls flat on its arse, which it might, it's not going to knock Nvidia out of the game.
 
Beat AMD at what game? I'm not trying to defend Nvidia here, as they don't exactly have any hardware to properly compete, but we're talking about a paper titan when it comes to AMD hardware. The 5000 series is fantastic hardware. I'd kill to own a 5850 in my machine, but where can you buy one? Not many places to be honest.

Secondly guys, stop acting like the graphic heavyweights are what make AMD and Nvidia the bulk of their cash, because they don't. They do two things: Generate excellent PR, and give gamers like you and me long lasting hard-ons. The majority of the money is made at the bottom end sector, where Intel dominates.

Third, let's talk about Fermi. Nvidia is trying to expand their market into Fermi, because they know that the future of computing is going to be CPU/GPU integration. Let's not forget that Nvidia is competing against both Intel and AMD here, both of which are also CPU makers. Eventually, and that might be 5-10 years from now, a single die will handle both of our CPU and GPU processing. How do you think Nvidia is going to compete in this market if it only has half the playing cards?

Simple. It has to create a market.

But what comes first, the chicken or the egg? In order to create a business that's to be taken seriously, Nvidia has to first create hardware that investors and potential customers see as serious business. They've spent a lot of time creating a software language, a debugging tool, and now they need a real piece of hardware to show off all of this potential. Yes, this is a huge risk for Nvidia here, but at least it's a smart one.

I also see Fermi as a very long-term strategic move to compete on the global market. Intel is busy formulating Larrabee, and what do you think a highly parallel x86 graphics core is going to be able to do? Well shit, a lot! It may be pricey, it may be heavy, and it may not be the fastest at gaming, but damn it sure spits out our weather prediction models pretty fast. The point I'm trying to get at here is that if Nvidia doesn't go forward with expanding their market with Fermi, Intel will by default. Why? Because once some bloke figures out how to load and run Windows 7 on a Larrabee card, it's going to open the eyes of the entire market and make everybody say, "Hey wait a minute..."

And before you guys lay into me, keep in mind I'm not trying to defend AMD or Nvidia. I'm just trying to look at things long term. Nvidia has been milking the G80 architecture since its release, and that was like, what, almost 3 years ago or something? (November 8th, 2006). And now they can't get Fermi out on time and everyone thinks they're in deep shit. They might suffer a few quarters, but it's hardly going to knock Nvidia out of the game. Even if Fermi falls flat on its arse, which it might, it's not going to knock Nvidia out of the game.

Well yeah, I agree with you on most of your points, but 4-5 weeks ago (this is a deja vu thread by the way) I posted about strategic moves on the part of Nvidia and was chewed up by Vengence, who now more or less agrees with what I wrote back then. Just like a couple pages back in this thread I also mentioned all the stuff about Nvidia dropping each of their product segment prices by one bracket, etc...

Back then I said that I believed Fermi was a good long-term move on Nvidia's part, but I also felt that they could have simply produced a streamlined DX11 part. DX11 is a specific piece of software. Yes, it's mostly games, but it's an API geared more toward graphics than general-purpose computing. Not that Nvidia is wrong to try, but they shouldn't put all their eggs in one basket.

Instead of revolutionizing an industry, a quick strike at something with a defined set of game rules (being DX11 and OpenGL) could help to put Nvidia back on top. They could still take their time at developing Fermi properly, while doing this. That's a better long term strategy IMO.

I really hope Nvidia has something to show, even if it has to run at 200MHz to work, just show us some working silicon! Hopefully they won't rush this thing into production and start off on the wrong foot.
 
Nvidia is trying to spread their tech into as many hands as possible. Then, when enough people have their tech, they will pay game developers to take advantage of it and surpass AMD in some regard. They tried it with PhysX, though I don't think it was very successful.

Then again, I might be wrong.
 
Yes of course, but Ferrari and Toyota both make money. They do it in very different ways. Nothing wrong with either approach, but Nvidia don't seem to really know which one they want to take.
Since they are different markets, there really isn't a problem with both.


Carrying them? That would be the complete opposite of a loss-leader...
If they can produce a GTX 280 for $100, sell 100,000 of them through EOL, and spend $10,000,000 on development, that means their cost per unit is $200. If they then sell them to an AIB for $180, who turns around and sells them to the retailer at $250, who turns around and sells them to you at $300, it is both a loss leader and something that is bearing the brunt of the development costs. (All the numbers here are made up; I kept the super secret inside information I have secret.)
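If you want to poke at that arithmetic yourself, here's a minimal sketch of the same amortization math, using the post's made-up figures (a hypothetical $100 build cost, 100,000 units, $10M development; illustrative numbers only, not real Nvidia data):

[CODE]
# Back-of-the-envelope loss-leader math, using the made-up numbers above.
production_cost = 100         # $ to build one card (illustrative)
units_sold      = 100_000     # units sold through EOL (illustrative)
development     = 10_000_000  # $ development cost carried by the part (illustrative)

# Amortize development across every unit sold.
cost_per_unit = production_cost + development / units_sold   # = $200

aib_price = 180   # what the AIB pays per card (illustrative)
nvidia_margin = aib_price - cost_per_unit                     # = -$20

print(f"Fully loaded cost per unit: ${cost_per_unit:.0f}")
print(f"Margin per unit sold to the AIB: ${nvidia_margin:.0f}")
[/CODE]

Under those assumptions the part loses $20 per unit while still absorbing the development bill, which is exactly the "halo"/loss-leader scenario being described.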

Course not. I was mocking the lack of justification for his/her statement that GPGPU could provide significant revenue. "Nvidia could make significant revenue from GPGPU", "I could win the lottery", "Transformers 3 could be a good movie". See?

Well, I assume part of Nvidia's pitch is that their solution won't cost $50M, but regardless. There are 10 new supercomputers every quarter? I'm skeptical. Got a cite?
I have no idea how many are sold every year. I know most supercomputer lists go out to about 500 for "major" supercomputers. I also know that in 4.5 years you'll have a computer that is 8 times as powerful, based on Moore's law. Rounding that to 5 years, to say a supercomputer is replaced every 5 years isn't a horrific assumption. And at that rate, you'll see 100 supercomputers a year ranked 500 and above. Let's say I've way overshot and the replacement rate is every 10 years. That is still 50 per year, or 12.5 per quarter. Furthermore, let's look at something else.

Assuming Nvidia can produce a single GF100 Tesla package for $500, using 10,000 of them would give the equivalent of the world's fastest supercomputer. Let's then spend the same on the rest of the structure, giving us $10M. Remembering that you sell based on what the market can bear, we'll charge $50M for it and net $40M. That gross profit represents 25% of their gross profit from last quarter. It's by no means chump change.
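For what it's worth, here's a quick sanity-check sketch of both estimates above, re-running the post's own assumptions (Moore's-law doubling every 18 months, a 500-entry supercomputer list, a hypothetical $500 GF100 Tesla package); none of these are sourced figures:

[CODE]
# Replacement-rate estimate: ~8x performance in 4.5 years from Moore's law.
speedup = 2 ** (4.5 / 1.5)          # 3 doublings in 4.5 years = 8x
print(f"Performance gain over 4.5 years: {speedup:.0f}x")

list_size = 500                      # size of a "major" supercomputer list
for replacement_years in (5, 10):
    per_year = list_size / replacement_years
    print(f"Replaced every {replacement_years} years: "
          f"{per_year:.0f}/year, {per_year / 4:.1f}/quarter")

# Revenue estimate for one hypothetical GPU-based supercomputer.
tesla_unit_cost = 500                # $ per GF100 Tesla package (assumed)
units           = 10_000
gpu_cost        = tesla_unit_cost * units     # $5M of GPUs
build_cost      = 2 * gpu_cost                # same again for the rest: $10M total
sale_price      = 50_000_000
print(f"Gross profit per sale: ${(sale_price - build_cost) / 1e6:.0f}M")
[/CODE]

Under those assumptions the math in the post holds up: 50-100 systems a year and roughly $40M gross per sale.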

I was just refuting the nonsense that it wasn't a proper launch. We all know it's TSMC who've screwed the pooch here, not AMD. Obviously it is not good for AMD either.
It wasn't a paper launch. It was a soft launch. There were products available to purchase, but not enough to satisfy demand. TSMC screwed ATI, just like TSMC screwed Nvidia. It is amazing how readily the blame gets passed on to TSMC instead of ATI, yet how much of it sticks to Nvidia.
 
I think it's funny how people can't get it through their heads that GPGPU and gaming aren't exclusive. It's not like a card aimed at GPGPU purposes will suck at gaming or the other way around. Yes, it may be less optimized for one task, but it will still give acceptable performance. How hard is it to accept that nVidia may have optimized Fermi for both GPGPU and gaming tasks? When did we all become experts?

Because few people will buy cards for anything other than bang for the buck, and if you're selling a massively powerful chip (that's both huge and expensive) against a smaller, focused competitor, you're not likely to win there. People have been quoting the numbers and saying that the performance will certainly beat the 5870, and it may, but the numbers hardly guarantee that, and there is almost ALWAYS a penalty for making anything general purpose. A lot of those transistors people are talking about are no longer there for graphics. Just how much the penalty is I have no idea, but I am worried it may make it an unviable option for gamers. For the cost, it is going to have to kick ass in games, and the 5800s already do and are cheaper at launch than what the GTX 300 is likely to cost (of course this is just going off rumor info, I haven't seen any citeable numbers, but still: a 512-bit GDDR5 bus and huge silicon).
 
looks like some of it is in stock right now as we speak...

Ooo, looks like Newegg got a shipment of VisionTeks in. (They were OOS earlier today.) Only $30 over MSRP too! Mwave and ZipZoomfly are still all OOS. We'll see how long these last at Newegg.
 
Ooo, looks like Newegg got a shipment of VisionTeks in. (They were OOS earlier today.) Only $30 over MSRP too! Mwave and ZipZoomfly are still all OOS. We'll see how long these last at Newegg.

Dang, only $30 more? I must get one now....:p

/sarcasm
 
Fermi may well offer a reasonably priced route to supercomputing, but it is going to be hard-pressed to offer a good performance/price ratio in the gaming world even if it does work out more powerful than ATI's cards.
That said, with Fermi's more compute-oriented architecture, its performance may further improve as physics, AI, and who knows what other features move to the GPU. The trouble for Fermi is that those features are still quite a way off yet... and ATI's next architecture, due later in 2010 I think, is also going to be more of a cGPU architecture... perhaps better timing on their part.
 
If I needed a gaming card right now, I'd get a 5870.

nVidia claimed to have misjudged the 4800 series' performance and had to lower prices.
With the 5800's out, nVidia will price according to their performance.
 
Well yeah, I agree with you on most of your points, but 4-5 weeks ago (this is a deja vu thread by the way) I posted about strategic moves on the part of Nvidia and was chewed up by Vengence, who now more or less agrees with what I wrote back then. Just like a couple pages back in this thread I also mentioned all the stuff about Nvidia dropping each of their product segment prices by one bracket, etc...

Back then I said that I believed Fermi was a good long-term move on Nvidia's part, but I also felt that they could have simply produced a streamlined DX11 part. DX11 is a specific piece of software. Yes, it's mostly games, but it's an API geared more toward graphics than general-purpose computing. Not that Nvidia is wrong to try, but they shouldn't put all their eggs in one basket.

Instead of revolutionizing an industry, a quick strike at something with a defined set of game rules (being DX11 and OpenGL) could help to put Nvidia back on top. They could still take their time at developing Fermi properly, while doing this. That's a better long term strategy IMO.

I really hope Nvidia has something to show, even if it has to run at 200MHz to work, just show us some working silicon! Hopefully they won't rush this thing into production and start off on the wrong foot.

Chip R&D in the big leagues, like Fermi, is obscenely expensive. And they probably have their spare hands busy trying to design the GTX 360/350/...

NV doesn't really have a choice, they need to start burning their war chest to carve out new markets since they aren't going anywhere with their old ones.
 
I'm surprised ATech isn't jumping in on this like a rotten diaper on a baby.

Here's what will happen (my guess): They will have an actual card at CES. No one will be allowed to touch it. No one will be able to see it run benchmarks. In other words, it will be a highly detuned Fermi beta engineering sample until they get the last of the kinks in hardware/software out. Wouldn't be the first time a hardware vendor did this.
 
Here's what will happen (my guess): They will have an actual card at CES. No one will be allowed to touch it. No one will be able to see it run benchmarks. In other words, it will be a highly detuned Fermi beta engineering sample until they get the last of the kinks in hardware/software out.

Well it had better have wood screws in it. Everything I've read about Fermi says they use wood screws. Fermi is a fake if there's no wood screws. :D
 
It may not be them. TSMC may be the one screwing them.
Well they're about half the problem
The longer it takes, the easier it is for AMD to tweak the refresh.
Refresh? More like a brand new architecture. 2H 2010 (Jun-Sept).
ugh. Sorry I was thinking hemlock. Just got finished putting together this post (genmay, rant at best buy) and not thinking straight.
That's OK, nobody thinks straight on this forum anyway:D
It's ok, I'm sure Nvidia will shrink and speed up the 9800 GTX+ and call it a GTS 250... errr I mean speed up the GTS 250 and call it a GTS 350. Man, some serious deja vu there.
There is no doubt about this.:D
It's the main one, but as said above, ATI probably did not expect it to be received like it was.
Why not? The 4800 series was a hit, why not the 5800?:confused:
So I take it I'll be able to buy one in June 2010?!
Boy you're just full of faith today my brother.:D
Ooo, looks like Newegg got a shipment of VisionTeks in. (They were OOS earlier today.) Only $30 over MSRP too! Mwave and ZipZoomfly are still all OOS. We'll see how long these last at Newegg.
will be gone faster than you can say Radeon 5870 FTW!!:D
Looks like you missed your chance. They are out of stock again.
What did I just say? Radeon 58... whoops, there they go:eek:
Well it had better have wood screws in it. Everything I've read about Fermi says they use wood screws. Fermi is a fake if there's no wood screws. :D
this is true
 
Do you know that means Nintendo soft-launched the Wii for 1-2 years?

Actually, it was only a soft launch in the US. There were plenty to go around in Europe. The weak supply in the US was created by the devaluation of the dollar vs. the euro. Nintendo could make more money (or was it lose less money? I don't remember if the Wii was sold at a loss or not) by selling the console in Europe because of the fixed price points.
 
Actually, it was only a soft launch in the US. There were plenty to go around in Europe. The weak supply in the US was created by the devaluation of the dollar vs. the euro. Nintendo could make more money (or was it lose less money? I don't remember if the Wii was sold at a loss or not) by selling the console in Europe because of the fixed price points.

The Wii is a Gamecube with a gimmick controller --- little to no R&D expenditure was required. It's making cash hand over fist, at least on console sales. Software is another story.
 
I will be going to CES every day, I would like to see if nVidia actually can pull this off. If so I am ovbiously going to play with it as well as eyefinity and see which one I am going to buy with my hard earned cash. Will the [H] team be at CES as well? I think it would be cool to meet you guys if you do attend.
 
The Wii is a Gamecube with a gimmick controller --- little to no R&D expenditure was required. It's making cash hand over fist, at least on console sales. Software is another story.

And the 5870 is just a 4870 with double the SPs tacked onto it. Little to no R&D required.

Brilliant.
 
Jepp. They slapped two DX10.1 cores together and got one DX11 out. Without any R&D. 10.1 + 10.1 = 11 Brilliant! :D
 
And the 5870 is just a 4870 with double the SPs tacked onto it. Little to no R&D required.

Brilliant.

Care to bring some actual facts to the table about my assertion? Or are you just going to be a miserable five letter word?
 
And the 5870 is just a 4870 with double the SPs tacked onto it. Little to no R&D required.

Brilliant.

4870 had under 1 billion transistors.
5870 has over 2 billion.

I don't get what you are saying.
 
Care to bring some actual facts to the table about my assertion? Or are you just going to be a miserable five letter word?

Facts? Let's talk about your facts first. Not needing to do any real R&D for a console like the Wii. Not having to work out the details of backwards compatibility, both hardware and software-wise. Not having to go through a zillion iterations of the new controller before ending up with one which works. Not having to work out all the details on the extended API and feature set and developing an SDK and tools.

It must be really nice to live in your world of reductio ad absurdum.

4870 had under 1 billion transistors.
5870 has over 2 billion.

I don't get what you are saying.
Oh yes, they just took two 4870 dies and merged them. Of course. I wonder why they didn't think of that before they released that snail of a 4870, I mean geez, they must be really dumb there at AMD. You really should go work there.

(in the real world a process shrink and adding a bit of logic to an existing circuit can really mess up stuff, requiring months of testing, debugging and reworking stuff to make it work correctly. I should know, one of my best friends is a senior ASIC designer)
 
Facts? Let's talk about your facts first. Not needing to do any real R&D for a console like the Wii. Not having to work out the details of backwards compatibility, both hardware and software-wise. Not having to go through a zillion iterations of the new controller before ending up with one which works. Not having to work out all the details on the extended API and feature set and developing an SDK and tools.

It must be really nice to live in your world of reductio ad absurdum.


Oh yes, they just took two 4870 dies and merged them. Of course. I wonder why they didn't think of that before they released that snail of a 4870, I mean geez, they must be really dumb there at AMD. You really should go work there.

(in the real world a process shrink and adding a bit of logic to an existing circuit can really mess up stuff, requiring months of testing, debugging and reworking stuff to make it work correctly. I should know, one of my best friends is a senior ASIC designer)

Don't become troll-bait dude. They're messin' with you. Don't take it seriously. The 5870 = two 4870 slapped together on the same die, was a sarcasm remark to the guy who said that the Wii was just a gamecube with a new controller. Two or three other posters took it too seriously and things are spiraling out of control... Capiche?

;)

C'mon guys let's get back on topic. How about them Yankees huh? hehehehe.... (crickets chirping at the silent roar of Fermi news)
 
Don't become troll-bait dude. They're messin' with you. Don't take it seriously. The 5870 = two 4870 slapped together on the same die, was a sarcasm remark to the guy who said that the Wii was just a gamecube with a new controller. Capiche?

;)

Yeah, I figured they either had to be n00bs or trolls. I just felt like ranting :D

BTW, I'm a dudette ;)
 