More Fermi Rumors - Not Good

Yeah, I don't think it's as serious/dire as 3dFX's situation just yet. Both ATi and Nvidia have had some lousy cards [relative to each other] in the past, but they seem to always bounce back within a generation or two. That's a good thing for us consumers. :)

ATI's last 'bounce back' happened after it got taken over, for what that's worth.

Not saying I'd want to buy NVDA, but right before the release of its next-gen architecture would be a good time to acquire it. Kind of like EA with Take-Two Interactive.
 
Is this a repeat of 2001? Isn't this kind of the same thing that happened to 3dFX when they floundered? Granted, nV has more going for it than 3dFX did, but not that much. ION can't be the savior once video is integrated onto the CPU die for all of those low-power PCs.

No, not really. 3dfx mainly went under because they blew all their savings buying STB, which turned out to be a dog, so when shit started going bad trying to get the last Voodoos right and out the door, there was no slack at all and they couldn't keep running.

Nvidia isn't in the same position.

If, after all this, Fermi turns out to be shit (very unlikely), or Nvidia just can't make money off them because they cost too much to make (not quite as unlikely), or ATI has already saturated the market (assuming they can achieve volume before Nvidia can start giving out good news), things may change.
 
When the boss says it would be cake to add DX11 to current G200 parts, in the face of a catastrophically delayed DX10.1 part with a new architecture, and doesn't even bring up Fermi in that same interview, I'm thinking of a disillusioned Ken Kutaragi.

:eek:

Jen-Hsun Huang = Ken Kutaragi (IMO)

Both of these guys are delusional. Smart, but delusional.

Totally agree on that thought of yours. ;)
 
Nvidia, you clever bastards, timing your release with income tax refund season instead of what by all accounts will be a very poor holiday shopping season. Just when we all get our nice income tax refunds (State of California residents need not apply), Nvidia will saturate the market and undercut AMD's pricing. Bravo, Nvidia.
Hahaha, nice. Just think how much they can charge with the $8,000 new-home tax credits the government is handing out!

3dfx? Nvidia purchased them, and then somehow they died off.
Aureal? Creative Labs bought them and you know what happened afterwards.
It's like there's something about big companies buying small competitors and offing them just for the sake of getting them out of the way, weird, innit?

I'm happy AMD gets to reap fine-as-hell profits for another 4-6 months; they've earned it for playing the game nicely in the face of infinitely richer and more resourceful opposition using scumbag tactics and anti-competitive business behaviour from all over.

I thought Nv would have learnt something from the 200 series before going on to this. Or was it the jump to 40nm, or was it biting off more than they could ever dream of chewing? When the boss says it would be cake to add DX11 to current G200 parts, in the face of a catastrophically delayed DX10.1 part with a new architecture, and doesn't even bring up Fermi in that same interview, I'm thinking of a disillusioned Ken Kutaragi.
Obvious troll is obvious.
 
They missed the ball on getting "Fermi build kits" out to AIBs? That seems kind of vague. Do you mean NV had chips but didn't have PCBs and diagrams set up to pass along, or that they didn't have the required chips? There is nothing NV can do if they don't have the chips, but instead of saying that (if that's the case), you just say they dropped the ball on shipping out a build kit, which makes NV look bad when there is so little information behind the statement. If they didn't get the PCBs made and the power diagrams ready for the different AIBs to build this card, that would be on NV and would be inexcusable, seeing as they have had all this time to do so. But if they can't get the chips from TSMC, I don't see what, if anything, NV can do about that.

I think some of this disdain for NV is unfounded, especially since they are sitting on their hands until TSMC can produce the silicon that NV orders up. If anything needs to be changed, which was apparently the case, NV has to send the revisions to TSMC, and again, NV can't do anything about it after the order has been sent in. It's clear the Fermi silicon wasn't right after the first revision, but Nvidia doesn't fabricate these things; they only design them and do QA on them, then make a reference design once they get proper silicon. So about half of the time it takes to do all of this isn't on NV, which is what people are getting confused about. It's clear ATi sent in their silicon designs first, but even then TSMC is really dropping the ball on production, which is evident, yet somehow people still hate on NV for this, like someone showed up to work hungover or something.
 
I don't think that has anything to do with it. I finally got tired of waiting and ordered a 5850 last night.
 
They missed the ball on getting "Fermi build kits" out to AIBs? That seems kind of vague. Do you mean NV had chips but didn't have PCBs and diagrams set up to pass along, or that they didn't have the required chips? There is nothing NV can do if they don't have the chips, but instead of saying that (if that's the case), you just say they dropped the ball on shipping out a build kit, which makes NV look bad when there is so little information behind the statement. If they didn't get the PCBs made and the power diagrams ready for the different AIBs to build this card, that would be on NV and would be inexcusable, seeing as they have had all this time to do so. But if they can't get the chips from TSMC, I don't see what, if anything, NV can do about that.

I think some of this disdain for NV is unfounded, especially since they are sitting on their hands until TSMC can produce the silicon that NV orders up. If anything needs to be changed, which was apparently the case, NV has to send the revisions to TSMC, and again, NV can't do anything about it after the order has been sent in. It's clear the Fermi silicon wasn't right after the first revision, but Nvidia doesn't fabricate these things; they only design them and do QA on them, then make a reference design once they get proper silicon. So about half of the time it takes to do all of this isn't on NV, which is what people are getting confused about. It's clear ATi sent in their silicon designs first, but even then TSMC is really dropping the ball on production, which is evident, yet somehow people still hate on NV for this, like someone showed up to work hungover or something.
A build kit is PCB + components + silicon/chip. It's the complete set of unsoldered parts needed to make a reference card, IIRC.
 
Not the best position for nVidia to be in, obviously missing the holiday season, but at the same time, given the state of the economy, if you were going to be late, now would be one of the better times to be so.

For me personally this is going to work out, as I wasn't planning my new rig until the new 32nm Intel parts are released, which should be around the April 2070 time frame.

But this had better be a HELL of a part.
 
Not the best position for nVidia to be in, obviously missing the holiday season, but at the same time, given the state of the economy, if you were going to be late, now would be one of the better times to be so.

For me personally this is going to work out, as I wasn't planning my new rig until the new 32nm Intel parts are released, which should be around the April 2070 time frame.

But this had better be a HELL of a part.

Argg...no edit!

I hope the April 2070 date isn't prophetic!!
 
ARE THE NEW NVIDIA CARDS GOING TO SUPPORT MULTIPLE MONITORS LIKE ATI AND EYEFINITY?

If not, they're still going to be way behind....
 
ARE THE NEW NVIDIA CARDS GOING TO SUPPORT MULTIPLE MONITORS LIKE ATI AND EYEFINITY?

If not, they're still going to be way behind....

1) Stop shouting.
2) While it's a neat technology and one I'll probably use some day, the adoption rate is very low for technology like that.
3) Are we going to call ATI behind because they don't have the 3D stuff?
 
http://www.vizworld.com/2009/11/nvidias-fermi-sc09/



The trouble TSMC is having ramping up 40nm production is one thing slowing down Fermi card production. It's the same fab that makes ATI's current 40nm GPUs, which is why those are in somewhat short supply. C'mon, you Taiwanese semiconductor manufacturing company, get your chips together already.

6.2x faster is nothing to sneeze at. Time will tell how that translates to game performance.

Question is, will double precision matter for gaming...

I think it won't for a very long time, just my opinion
 
I doubt this is all TSMC's problem.

IIRC, GF isn't even producing 40nm chips right now, which probably means they have the same problem.

If TSMC were the only one to blame, ATI/nVidia would already be switching to others.

Most likely all the other silicon manufacturers are having the same problems.

Actually, I am sure most of it is. If it was just AMD, that would be one thing, but Nvidia is also having issues with the process. I also suspect that both Nvidia and ATI will be going other places, or at least not putting all their eggs in a single basket.
 
6.2x faster is nothing to sneeze at. Time will tell how that translates to game performance.

Problem is, NV is missing their clock targets big time.

The end result is a 6.2x increase in a DP simulation, not far off from previous expectations of 8x. This 8x figure is compared to the GTX 285, whereas the C1060 is based on the slower GTX 280, so we should've been expecting 9-10x improvements (768 vs. 78 Gflops). Even considering scaling inefficiencies, it is clear Nvidia have missed their previous expectations by missing their clock target, as 6.2x is nowhere near 9-10x. Do remember that this is A2 silicon, and the final A3 silicon might be a tad faster.

http://vr-zone.com/articles/nvidia-fermi-demonstrated-at-sc-09/8071.html?doc=8071
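If you want to sanity-check that, the arithmetic is simple enough to run yourself. The sketch below assumes the demo scales linearly with peak DP throughput (which real workloads rarely do) and uses only the ~768 and ~78 GFLOPS figures quoted above:

Code:
# Back-of-the-envelope check of the DP speedup numbers quoted above.
# ~768 GFLOPS was the early Fermi/Tesla DP expectation, ~78 GFLOPS is the
# GT200-based Tesla C1060. Both are the figures from the post, not new data.
fermi_dp_target = 768.0   # GFLOPS (early expectation)
c1060_dp = 78.0           # GFLOPS (Tesla C1060)

theoretical_speedup = fermi_dp_target / c1060_dp
print("Theoretical DP speedup: %.1fx" % theoretical_speedup)   # ~9.8x

demoed_speedup = 6.2      # what was shown at SC09
# If the demo scaled roughly with peak DP rate, the implied shortfall is:
print("Implied fraction of target throughput: %.0f%%"
      % (100.0 * demoed_speedup / theoretical_speedup))        # ~63%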
 

The numbers are from Tesla. VR-Zone and BSON both made bad extrapolations of clock speeds based on the GFLOPS numbers. BSON actually updated their article.

BSON said:
Update #2, November 18, 2009, 02:17AM GMT: Following our article, we were contacted by Mr. Andy Keane, General Manager of Tesla Business, and Mr. Andrew Humber, Senior PR Manager for Tesla products
BSON said:
ECC is enabled from both the GPU side and the memory side; there are significant performance penalties, hence the GFLOPS number is significantly lower than on Quadro/GeForce cards.

ECC will be disabled on GeForce cards and most likely on Quadro cards
BSON said:
Larger thermal exhaust than Quadro/GeForce to reduce the thermal load
Tesla cGPUs differ from GeForce with activated transistors that significantly increase the sustained performance, rather than burst mode.
 
nVidia is putting Tegras in the next Nintendo handheld. While I wouldn't go so far as to say they couldn't give two shits about the videocard market now, they certainly do not have anything to worry about for the moment.

nVidia is already spreading out thanks to pressure from AMD and Intel.

Oh, wait. I forgot.

DOooooooom! DOOOOOOOOOOM!
 
nVidia is putting Tegras in the next Nintendo handheld. While I wouldn't go so far as to say they couldn't give two shits about the videocard market now, they certainly do not have anything to worry about for the moment.

nVidia is already spreading out thanks to pressure from AMD and Intel.

Oh, wait. I forgot.

DOooooooom! DOOOOOOOOOOM!

Well that's the thing here, perspective.

Being [H]ard, we tend to focus on the enthusiast-level stuff, and that creates a tendency to overestimate the significance of bad news about higher-end parts. At least 99.99% of computer owners and buyers have no clue what's in their PC or laptop, and "Fermi" means nothing to them. $200+ graphics cards are a trivial part of the overall market, and under $200, where the vast majority of the volume/revenue is, Nvidia is doing just fine, as is ATI, as is Intel, fer crissakes.

To us this stuff matters because we're interested in it, but Joe Schmoe just wants a card that can run WoW or The Sims, and there are no problems for Nvidia there.

The overall market is much more forgiving than news about late Fermi cards would lead one to believe :p
 
ARE THE NEW NVIDIA CARDS GOING TO SUPPORT MULTIPLE MONITORS LIKE ATI AND EYEFINITY?

If not, they're still going to be way behind....

I would think they HAVE TO. What are you really going to do with all that GPU power in a gaming environment?
 
nVidia is putting Tegras in the next Nintendo handheld. While I wouldn't go so far as to say they couldn't give two shits about the videocard market now, they certainly do not have anything to worry about for the moment.

nVidia is already spreading out thanks to pressure from AMD and Intel.

Oh, wait. I forgot.

DOooooooom! DOOOOOOOOOOM!

Exactly.

This stuff only matters here. No one else in the world could give two shits, and really, where are the games that even need this thing?

Eyefinity-type stuff is nice and really the only way to use this much power, but I don't have the moola for 3 monitors.
 
My AA math is a little fuzzy, but wouldn't your sample size be almost the size of the screen at that point?

Then do it SSAA rather than MSAA. :p
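Just to put the "sample size almost the size of the screen" question into rough numbers, here's a small sketch. The 5760x1200 resolution is an assumed 3x1 Eyefinity setup, and the 8 bytes per sample (colour + depth) is a simplification, so treat it as an illustration only:

Code:
# Illustrative SSAA sample-count math at an assumed 3x1 Eyefinity resolution.
width, height = 5760, 1200          # assumed, not tied to any specific card
pixels = width * height             # ~6.9 million pixels

for factor in (4, 8):
    samples = pixels * factor
    # assume 4 bytes colour + 4 bytes depth per sample in the oversized buffer
    mem_mib = samples * 8 / float(2 ** 20)
    print("%dx SSAA: %.1fM samples, ~%.0f MiB of buffer"
          % (factor, samples / 1e6, mem_mib))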


Tick tock goes the clock as the pendulum swings back and forth... NVIDIA has been down, ATI has been down. It's far too soon to start planning the funeral.
 
nVidia was on top, it was theirs to lose, and they've done just that. What a shame to see this happen. If I were nVidia, I'd shore shit up, produce nothing but open reference designs, let users tweak to their hearts' content, get their driver priorities straight, and stop fucking around with the slick marketing hype they tried to generate with name/series switching. I knew that was going to come back and bite them in the ass, and it's done just that.
 
I think they must be really pissed that they are so far behind the 8-ball, or they just really missed what ATI could do. The 5970 rocks! ATI didn't have to beat them by much, and we know ATI can sell them cheaper if they want to.

Nvidia is hurting big time. I hope they survive.
 
This is getting ridiculous. My 260 is dying and I was hoping to hang on, but at this rate I'm going to have to get a new card, and it doesn't make much sense to intentionally buy a DX10.1 card for a gaming rig.

The retarded thing is I usually buy upper mid-range (like a 260), and by the time this thing comes out, the lower-grade models might not make it out until mid-summer. By that time you're looking at a few months before ATI releases the next model line. Good grief, Nvidia, get it together.
 
Actually, I am sure most of it is. If it was just AMD, that would be one thing, but Nvidia is also having issues with the process. I also suspect that both Nvidia and ATI will be going other places, or at least not putting all their eggs in a single basket.

Huh, you seem to have mistaken my words.

What I mean is that even with GF, they will still have the same problem; then it's really not TSMC's fault, it's ATI/nVidia moving forward too fast in terms of die size.
 
Huh, you seem to have mistaken my words.

What I mean is that even with GF, they will still have the same problem; then it's really not TSMC's fault, it's ATI/nVidia moving forward too fast in terms of die size.

Ah, I did misunderstand, and you may have something there. I still haven't heard a convincing reason why they didn't just do a die shrink of the GTX280. Clock that sucker up and plug the hole until Fermi. ATI's chip isn't that big though (large, yes, but not huge). Hope GF does better.
 
To the doomsday people: I would not be writing them off yet. I am sure this next cycle is going to be rough, and I doubt the first gen is going to be anything like they hope, but give them a chance to work out the bugs. Even Intel fucked up on a couple of cycles; it was AMD kicking their ass that finally made them man up. I think Nvidia may yet do the same. They just need to get rid of most of their brass.

Another reason why I'm afraid Nvidia will keep re-branding this cycle for a long time.

Nvidia is hurting big time. I hope they survive.

Are you being sarcastic? Because I hope ATI will survive after Nvidia's cards hit the market :D
 
Another reason why I'm afraid Nvidia will keep re-branding this cycle for a long time.



Are you being sarcastic? Because I hope ATI will survive after Nvidia's cards hit the market :D

By then ATI will probably bash nVidia with a new refresh...

Then we should see some epic price war. :D
 
Man, I bet Nvidia never envisioned being in this position this time last year.

Who does? AMD/ATI finally competes and they get (so far as can be seen) the win. Now this should be good, as it will put Nvidia, and their fanbois, in their place a bit. This should also let the AMD/ATI fanbois learn that their favored engineering team will bend them over just as quickly as Nvidia would.

At least two more cycles of AMD/ATI winning, please. That ought to sap the Nvidia team's ego a bit, and make the AMD/ATI fanbois sore from the lack of lube.
 
Who does? AMD/ATI finally competes and they get (so far as can be seen) the win. Now this should be good, as it will put Nvidia, and their fanbois, in their place a bit. This should also let the AMD/ATI fanbois learn that their favored engineering team will bend them over just as quickly as Nvidia would.

At least two more cycles of AMD/ATI winning, please. That ought to sap the Nvidia team's ego a bit, and make the AMD/ATI fanbois sore from the lack of lube.

I am hoping that you're wrong here. AMD has been pretty good at learning from Nvidia's mistakes (so it seems, at any rate), and they haven't been greedy just yet (anyone thinking that they are not taking it on the nose right now with their lack of volume needs to go back to Economics 101). They did launch the cards at pretty reasonable prices, all things considered. What I am hoping for is that when they do get the ramp-up, they don't pull a "milk them for all they are worth". I hope they do remember the consumer backlash Nvidia saw.
 
I think some of this disdain for NV is unfounded, especially since they are sitting on their hands until TSMC can produce the silicon that NV orders up. If anything needs to be changed, which was apparently the case, NV has to send the revisions to TSMC, and again, NV can't do anything about it after the order has been sent in. It's clear the Fermi silicon wasn't right after the first revision, but Nvidia doesn't fabricate these things; they only design them and do QA on them, then make a reference design once they get proper silicon. So about half of the time it takes to do all of this isn't on NV, which is what people are getting confused about. It's clear ATi sent in their silicon designs first, but even then TSMC is really dropping the ball on production, which is evident, yet somehow people still hate on NV for this, like someone showed up to work hungover or something.

Fabbing a 500mm^2 megachip isn't a trivial matter either. I seem to recall other tech companies slugging it out with megachip vs. scalable architectures... ah yes, AMD vs. Intel. A Kentsfield quad-core was composed of two glued-together 143mm^2 Conroe chips, while Barcelona was a single 285mm^2 chip. Suffice it to say, Kentsfield was much easier to build than Barcelona.
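A toy yield model makes the megachip point concrete. This is only a sketch using a simple Poisson yield formula with a made-up defect density (not a real TSMC 40nm figure); the only takeaway is how quickly yield falls off as the die grows:

Code:
import math

# Toy Poisson yield model: yield = exp(-die_area * defect_density).
D0 = 0.4  # defects per cm^2 -- assumed placeholder value

dies = [
    ("Conroe (143 mm^2)", 143.0),
    ("Barcelona (285 mm^2)", 285.0),
    ("Fermi-class (~500 mm^2)", 500.0),
]

for name, area_mm2 in dies:
    area_cm2 = area_mm2 / 100.0
    y = math.exp(-area_cm2 * D0)
    print("%s: ~%.0f%% yield under this toy model" % (name, 100.0 * y))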
 
Another reason why I'm afraid Nvidia will keep re-branding this cycle for a long time.



Are you being sarcastic? Because I hope ATI will survive after Nvidia's cards hit the market :D

For once I wish they had done a 40nm GTX 280, at least something to hold them over until they were ready to go with Fermi. I have a feeling that this is going to take them a lot longer than they thought.
 
They missed the ball on getting "Fermi build kits" out to AIBs? That seems kind of vague. Do you mean NV had chips but didn't have PCBs and diagrams set up to pass along, or that they didn't have the required chips? There is nothing NV can do if they don't have the chips, but instead of saying that (if that's the case), you just say they dropped the ball on shipping out a build kit, which makes NV look bad when there is so little information behind the statement. If they didn't get the PCBs made and the power diagrams ready for the different AIBs to build this card, that would be on NV and would be inexcusable, seeing as they have had all this time to do so. But if they can't get the chips from TSMC, I don't see what, if anything, NV can do about that.

I think some of this disdain for NV is unfounded, especially since they are sitting on their hands until TSMC can produce the silicon that NV orders up. If anything needs to be changed, which was apparently the case, NV has to send the revisions to TSMC, and again, NV can't do anything about it after the order has been sent in. It's clear the Fermi silicon wasn't right after the first revision, but Nvidia doesn't fabricate these things; they only design them and do QA on them, then make a reference design once they get proper silicon. So about half of the time it takes to do all of this isn't on NV, which is what people are getting confused about. It's clear ATi sent in their silicon designs first, but even then TSMC is really dropping the ball on production, which is evident, yet somehow people still hate on NV for this, like someone showed up to work hungover or something.
My impression was that they didn't have the kits ready, or rather the bugs worked out (though this is surely related to your point?). If it was a simple matter of parts, then they could have pointed the finger straight at TSMC.

You're quite right that Nvidia does not deserve all the disdain for being late with Fermi (other things, yes), but it really didn't help their cause running off at the mouth the way they did. Before you tell your competitors that you are going to open up a can of whoop-ass, you should check your pantry and make sure you have some. That, along with some of their shady business tactics, has really added to the consumer backlash against them. I hope they take all this to heart and change some things.
 
I'm glad that I'll only be building my new system early next year.

I'm seriously thinking of using my GF9800GTX with the new build if things don't improve by then. It's going to suck, but I don't have unlimited cash lying around.
 
So, what do we know about Fermi?
It is HUUUUGE, and it is expensive to produce (3 re-spins). At +75% the size of ATI's chip, I expect it to cost quite a bit more than the ATI card (with margin, I expect about $700-800), so actually I see no price competition whatsoever, even if it were available right here, right now.
I am not even thinking about dual-Fermi cards here. :eek:
My guess is the first few batches of Fermi will be sold to HPC clusters before they introduce the desktop version (there's a lot more margin to be made in the Tesla business).
And since Glo-Fo was founded by AMD, guess who will get the better contract? It surely isn't nice to be in NVIDIA's position atm.
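For what the "+75% the size of ATI's chip" guess would mean for cost, here's a rough dies-per-wafer sketch. The ~334 mm^2 Cypress area and the wafer price are assumptions, and the Fermi area just takes the +75% figure above at face value; yield losses aren't included, so real costs would be worse for the bigger die:

Code:
import math

WAFER_D = 300.0       # mm (300 mm wafer)
WAFER_COST = 5000.0   # USD per wafer -- assumed placeholder

def gross_dies(area_mm2, wafer_d=WAFER_D):
    """Standard gross-die-per-wafer approximation (edge loss, no yield)."""
    r = wafer_d / 2.0
    return math.pi * r * r / area_mm2 - math.pi * wafer_d / math.sqrt(2.0 * area_mm2)

cypress_area = 334.0                 # mm^2, ballpark for Cypress
fermi_area = cypress_area * 1.75     # taking the "+75%" claim at face value

for name, area in (("Cypress", cypress_area), ("Fermi (per the +75% guess)", fermi_area)):
    n = gross_dies(area)
    print("%s: ~%.0f gross dies/wafer, ~$%.0f per die before yield"
          % (name, n, WAFER_COST / n))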
 
I am hoping that you're wrong here. AMD has been pretty good at learning from Nvidia's mistakes (so it seems, at any rate), and they haven't been greedy just yet (anyone thinking that they are not taking it on the nose right now with their lack of volume needs to go back to Economics 101). They did launch the cards at pretty reasonable prices, all things considered. What I am hoping for is that when they do get the ramp-up, they don't pull a "milk them for all they are worth". I hope they do remember the consumer backlash Nvidia saw.

AMD is a company that is in business to make money. If Nvidia comes out with a "dog" (this is relative to performance, obviously), then AMD will do the same thing they did to Intel when the Athlon 64 was competing against the Pentium 4 C/D: jack up the price. Part and parcel of the industry, mate. You see, when your technology is second-rate (i.e., not the top performance holder), you drop the price to compete (as AMD has done since Core 2 came out), and when you are on top, you charge through the nose for it.

Definitely a fanboi if you don't remember that. Or when ATI was last a mile ahead and abused its position.
 
I regret that I didn't buy the HD5870 when they were released; not only was it cheaper, but stock was better than it is now. I bet that people who jumped early on the HD5800 bandwagon will get great value from their cards, just like people who bought the 8800GTX early.
 