Intel: Larrabee Graphics Chip Delayed

HardOCP News

According to C|Net, Intel is delaying Larrabee because development is "behind where we hoped it would be." For now, Larrabee will be used as a software development platform for internal and external use.

"Larrabee silicon and software development are behind where we hoped to be at this point in the project," Intel spokesperson Nick Knupffer said Friday. "As a result, our first Larrabee product will not be launched as a standalone discrete graphics product," he said.
 
Anyone expecting Intel to catch up, in just a few years, to a decade of continually improving graphics technology was in for a disappointment. Intel needs to absorb Nvidia to have a chance.
 

Nvidia's stock is going to completely tank with their next quarterly report. They will be very vulnerable to a takeover. I doubt Intel will swallow NV, though, as it would take several years to digest NV and would put Larrabee even further behind the release of Fusion.
 

True, but it would still give Intel an integrated solution that did not suck.
 
Wow! :eek:

I really expected, with all the profits Intel has to play with, that they would build something uber cool.

The tech demo they did really looked bad, though.

I'm glad they are failing at it. I hope they continue to fail. They are making great CPUs right now, but if they made a great video card, or great on-chip graphics, then it's game over for Nvidia.

ATI may carry on with AMD, if AMD can make something to shake up Intel.
 
And how exactly are you so sure of this?

Sure? I am not even sure how old I am most mornings. I am not sure of that at all.
My response was aimed at his alluding to the possibility of Intel buying NV. (Which is not something I believe is going to happen anytime soon, mind you.) However, if they did buy NV, they would have access to NV's integrated graphics solutions, which have historically been better than Intel's when it comes to graphics. Intel integrated graphics suck, period. ATI and NV integrated graphics suck far less.
 
You're all kidding right?

For one thing, Intel plays with release dates all the time.

For another, Intel has nothing to lose by not releasing a dedicated graphics platform.

Further, Larrabee is not JUST a graphics platform; it's meant to push parallel computing, which will eventually mean the end of discrete graphics altogether.

Intel can wait and bide its time instead of rushing a half-baked product to market.

I'd bet that 10 years from now Nvidia has gone the way of Voodoo and ATI is a direct competitor to Intel, product for product.

:cool:
 
Wow, this is a serious 180 from this article. That article and its author look like jokes now.

Yes and no; read http://www.brightsideofnews.com/new...-1tflops---27x-faster-than-nvidia-gt200!.aspx

They are going off an SGEMM test (read: math) rather than a graphics test. I am thinking they are finding that the card doesn't do graphics well, hence the delay. But as it still competes with (or rather creams) Nvidia's Tesla cards, it's viable for other uses; just the consumer version is canceled. Then again, I am just guessing here.
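
For anyone wondering what that benchmark actually measures: SGEMM is just single-precision general matrix multiply, C = alpha*A*B + beta*C, the classic throughput test for parallel hardware. Here's a naive sketch in C of the computation being timed (real BLAS implementations are blocked and vectorized; this is only to show the math being counted):

#include <stddef.h>

/* Naive SGEMM: C = alpha*A*B + beta*C, n x n matrices, row-major.
   The benchmark counts the 2*n^3 floating-point operations below;
   production BLAS tiles for cache and uses SIMD, but the math is the same. */
void sgemm_naive(size_t n, float alpha, const float *A,
                 const float *B, float beta, float *C)
{
    for (size_t i = 0; i < n; i++) {
        for (size_t j = 0; j < n; j++) {
            float acc = 0.0f;
            for (size_t k = 0; k < n; k++)
                acc += A[i * n + k] * B[k * n + j];  /* multiply-add core */
            C[i * n + j] = alpha * acc + beta * C[i * n + j];
        }
    }
}

That inner loop is nothing but multiply-adds over regular, predictable memory, which is exactly what a wide-SIMD chip like Larrabee eats for breakfast; it says nothing about texturing, rasterization, or driver quality.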
 
I'd bet that 10 years from now Nvidia has gone the way of Voodoo and ATI is a direct competitor to Intel, product for product.

That might be true if it weren't for the fact that Nvidia is still doing better financially than AMD/ATI. ATI has a solid win in the enthusiast market currently, but AMD still isn't anywhere near paying off the amount it borrowed in past years. Nvidia is making an excellent profit in every other market they're in right now, except physics, which is difficult to quantify. AMD was about to go under just a couple of years ago... you think they just waved a magic wand and it disappeared? I think ATI is putting out a great product, but remove your blinders and see that the enthusiast market makes up a small percentage of these companies' real-world profits. ATI can win the battle and still lose the war. They need better processors that don't lag a year behind their direct competitor.
 
I'd bet that 10 years from now Nvidia has gone the way of Voodoo and ATI is a direct competitor to Intel, product for product.

I'm pretty sure 5 years ago someone made that same bet and lost.
 
Something's fishy here...

Intel announces a single-chip cloud computer with up to 48 cores; now Intel cancels Larrabee, which also had up to 48 cores...
 

GPUs are used for cloud computing due to their parallel processing design. It's a sign that they were having trouble coming up with stable drivers.
 
Gee, who would've thought.

Jen @ Nvidia :p

Seriously, who the heck thought Intel could compete in the graphics arena, which has an even higher turnover than CPUs? Intel bought the i740, and that was the only 3D card worth a damn from them, only to let it languish into Intel "Extreme" (extremely slow?) graphics. Intel doesn't see the profit in playing in that space. High costs, quick turnover, et al.
 
I'd bet that 10 years from now Nvidia has gone the way of Voodoo and ATI is a direct competitor to Intel, product for product.

:cool:

I would be willing to take that bet. One thing I've learned over the years is that Nvidia has constantly been written off, but who's laughing now? Matrox? Meh, they just gave up. ATI had to be bought out (or they probably would have "died" without the capital infusion, at least the gaming portion), and we all know where 3dfx went. I won't go into Trident, et al.

Who's still producing "top" cards independently? Nvidia.
Who's still trying to come up with new reasons other than just games for home GPUs? Nvidia.
Who actually went out and tried to really end the chicken/egg scenario for off-loading physics processing? Nvidia.

I don't have to like them, but I can respect their work in expanding gaming and GPUs in general. I think that's why ATI had to be bought out, because what the f*ck did they do for years? They even gave up on their AIW cards, my favorite cards from them...
 
Well, blame the Intel fanboys for hyping this thing, the ones who kept saying Larrabee was going to pwn AMD/ATI and Nvidia.

I LOL'd the first time and I'm LOLing now.
 
I would be willing to take that bet. One thing I've learned over the years is that Nvidia has constantly been written off, but who's laughing now? Matrox? Meh, they just gave up. ATI had to be bought out (or they probably would have "died" without the capital infusion, at least the gaming portion), and we all know where 3dfx went. I won't go into Trident, et al.
What are you talking about? ATI was in no danger of going under. In fact, it's the main business that's keeping AMD afloat. ATI's problem was that they didn't have a decent chipset business, while Nvidia at the time did. That was ATI's main problem, not graphics.
Who's still producing "top" cards independently? Nvidia.
What benchmark are you using? Have you seen Fermi benchmarks? Please share.
Who's still trying to come up with new reasons other than just games for home GPUs? Nvidia.
I think what you meant to say was: who HAS to come up with other reasons for home GPUs.
Who actually went out and tried to really end the chicken/egg scenario for off-loading physics processing? Nvidia.
Huh? PhysX was its own company that had its own boards that did this long before Nvidia did.
 
I'd bet that 10 years from now Nvidia has gone the way of Voodoo and ATI is a direct competitor to Intel, product for product.

:cool:

This is SOMETHING I DO NOT WANT TO EVEN THINK ABOUT, unless you LIKE paying a grand for GPUs.
 
Hey, didn't Intel just get access to AMD/ATI's latest tech from losing that court case where they paid AMD/ATI 1.5 zillion dollars? So now, can they take what is best from AMD and add it to their own design?

And if what I'm guessing is right, will this help with compatibility?
 

Compatibility with what?
 
Something's fishy here...

Intel announces a single-chip cloud computer with up to 48 cores; now Intel cancels Larrabee, which also had up to 48 cores...

Larrabee used a synchronized (coherent) cache design, which made it extremely hard to scale: 24 cores would have been possible, 48 a definite stretch. It never would have been able to take off in terms of massive scaling. The new 48-core chip Intel has shown doesn't use synchronized cache memory, making it more like a networked matrix of cores; that design should scale well beyond 48 cores. It also uses a full IA, allowing regular x86 code to run on a single 'tile', unlike Larrabee, which had a non-standard IA.

In a sense, it's Larrabee's more practical-minded brother.
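
To make the difference concrete, here's a toy sketch in plain C11 (purely illustrative, nothing Intel-specific): with coherent caches a core just reads what another core wrote and the hardware keeps every cache line in sync, while in an SCC-style message-passing design the sender has to deposit data into a mailbox explicitly and the receiver polls for it.

#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>

/* Toy mailbox standing in for SCC-style message passing: the sender writes
   a payload and raises a flag; the receiver polls the flag. Nothing is
   shared implicitly, so no global cache synchronization is needed. */
struct mailbox {
    int payload;       /* message body */
    atomic_bool full;  /* "message waiting" flag */
};

static struct mailbox box;

static void *sender(void *arg)
{
    (void)arg;
    box.payload = 42;                            /* write the message */
    atomic_store_explicit(&box.full, true,
                          memory_order_release); /* publish it: the "send" */
    return NULL;
}

static void *receiver(void *arg)
{
    (void)arg;
    while (!atomic_load_explicit(&box.full, memory_order_acquire))
        ;                                        /* poll the mailbox */
    printf("received %d\n", box.payload);
    return NULL;
}

int main(void)
{
    pthread_t s, r;
    atomic_init(&box.full, false);
    pthread_create(&r, NULL, receiver, NULL);
    pthread_create(&s, NULL, sender, NULL);
    pthread_join(s, NULL);
    pthread_join(r, NULL);
    return 0;
}

Keeping that flag-and-payload handshake explicit in software is what lets the tile design skip global cache coherence, which is exactly the part of Larrabee that wouldn't scale.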
 
What are you talking about? ATI was in no danger of going under. In fact it's the main business for AMD that's keeping them a float. ATI's problem was that they didn't have a decent chipset business meanwhile Nvida at the time did. That was ATI's main problem, not graphics.

What benchmark are you using? Have you seen Fermi benchmarks? Please share.

I think what you meant to say was who has to come up with other reasons for home GPUs.

Huh Physx was it's own company that had it's own boards that did this long before Nvidia did.

1) ATI's chips, pardon the phrase, were utter sh*t for years. Second place in a two-horse race isn't good, and you have to admit it's a two-horse race in real terms, because nobody else has released anything significant for how long now? The big difference now for ATI is the cozier relationship with TSMC and the upcoming AMD (pardon me, Global Foundries) fabs. This may be part of the problem here. I'm going from the mid '90s to current, and you are probably thinking of the last two gens. Prior to this last year, the only real winner ATI had (other than spending $6 million on getting a couple of extra frames in Source) was the 9700/9800, and those were roundly cracked upside the head by the 6 series. So let's see: the 6, 7, 8, and 9 series, and for a good chunk of time the GTX cards (in fact, until the latest release of the new X2, the 295 was still king of the hill), were "winning", and ATI had to try to compete on price. There was a good reason their cards were cheaper, just like AMD CPUs are magically cheaper than Intel CPUs again. There was a reason they were for sale.

Now let's go back further and realize that the 9700/9800 (until the massive campaign of re-branding the same chips under different names began) utterly smacked around the 5 series. Nvidia was, and rightfully so, panned by all but the fanbois. Seems like the ATI fanbois would like to re-write history a little.

2) 3dfx: bought out. ATI: bought out. Matrox: no longer in the market. Trident (that was a joke): gone. S3: not in the market, and another joke really. Tell me, who's still producing top-performing chips independently? This may simply have been you misunderstanding the word "independent", but I'm not sure.

3) I guess ATI is oh so desperate themselves, then. I mean, they're trying desperately to ensure that Folding@Home and SETI@Home (among other applications) run well on their chips, and so on. The difference here is that, from the way it appears, Nvidia started on this back with the TNT, when ATI was still churning out 3D (if you squint) RagePro chips.

4) I'm going to educate you a bit on what a chicken-and-egg problem is. 3D graphics, real 3D graphics, came along in the form of the 3dfx Voodoo pass-through card. Problem was: why buy it if there's no software that supports it? And as a developer, why support it if only a very small segment has the hardware to run it? That is a basic chicken-and-egg problem, and yes, it can get more complex.

Now Ageia, a pretty smart little company, comes up with a software language AND hardware to increase the amount of physics calculation available to a game, so developers can have more realistic items (cloth, collisions, fractures, et al), and even gets a couple of companies to release cards with their PhysX engine on them. Why? Well, doing those calculations on separate hardware frees up CPU cycles nicely. The chicken-and-egg scenario kicks in, however: why buy it if nothing supports it? Why support it if only a small segment has it? Enter Nvidia, operating on their own selfish motives: they buy Ageia, take their own programmable GPGPU (as all should be called from now on, even ATI graphics chips, if they support reprogrammability), and redo PhysX, putting TWIMTBP dollars behind it. Egg solved for the developer: after all, there's now a bunch of hardware in the wild, and it's only growing. Not only that, but you now have dollars (or, more commonly, programmers writing hardware-specific loops for your game, under NDA, for you to review) behind including it.
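
If it helps to see why offloading buys anything at all, here's a rough sketch (my own toy code in C, not Ageia's or Nvidia's actual API) of the shape of the work: a physics step is one big loop over independent particles, so it carves up cleanly across a dedicated card's cores while the CPU gets on with AI and game logic.

#include <stddef.h>

/* Toy physics step: semi-implicit Euler integration over a particle array.
   Every particle updates independently, so the loop splits across however
   many cores or GPU lanes the hardware offers; a PPU/GPGPU can run this
   whole pass while the CPU does something else entirely. */
struct particle {
    float x, y, z;    /* position */
    float vx, vy, vz; /* velocity */
};

void physics_step(struct particle *p, size_t count, float dt, float gravity)
{
    for (size_t i = 0; i < count; i++) {
        p[i].vy += gravity * dt; /* accumulate acceleration */
        p[i].x  += p[i].vx * dt; /* integrate position */
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}

Cloth and collision response are hairier, but they decompose the same way, which is why a separate PPU, and later the GPU, was such a natural home for this work.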

Just a little clarification for you bucko.
 

You've got so many inaccurate statements in that sea of crap, while making new points that weren't in your original argument, that I'll be here all day trying to unravel it all.

ATI was easily competitive with Nvidia all the way from the 5 series until the 8 and 9 series, and after that they've stayed easily competitive going forward. There are more than enough benchmarks around to prove this point.

http://techreport.com/articles.x/7679/4
http://techreport.com/articles.x/7679/6
http://techreport.com/articles.x/7679/8
http://techreport.com/articles.x/9310/6
http://techreport.com/articles.x/9310/7
http://www.hardocp.com/article/2006/01/24/ati_radeon_x1900_series_evaluatio/7
http://www.hardocp.com/article/2005/03/14/visiontek_xtasy_x850_xt_review/5

These benchmarks say you don't know what you are talking about. 'Nuff said on that one.

Second, what difference does it make that ATI is not independent? People don't care whether the company has merged or stands by itself. The company could be on planet buttwipe for all I care; as long as they make decent products, that's all that matters. We care about performance; apparently you care about what color the roses are. Great!

Your third point: I'll say it again, Nvidia HAS to come up with another use for their GPUs; ATI does not. Yes, they have the feature, but who here thinks ATI has spent more time on GPGPU functionality than Nvidia? ATI's latest slogan is "this is a card for gamers". Does it sound like they care as much about GPGPU performance as Nvidia does? Nope.

Fourth, with all of that crap you've said, it still doesn't change the fact that Nvidia was NOT the first to offload physics. You've been wrong since point one and you're wrong here. I don't care if you like green eggs and ham, I will not eat them here or there, and it doesn't change the fact that without Ageia you wouldn't be eating them anywhere.

My final point is that I primarily buy Nvidia (actually, it's all I've bought), so I'm not some ATI fanboy, but I felt a need to respond because it looks like you've been smelling Nvidia's farts for an extended period of time, and in this inebriated state you've felt the need to share your thoughts with us. Thank you for sharing your opinions with me, but since most of what you typed was incorrect, I've learned nothing from your attempt at educating me. Thank you for wasting my time. :)
 

You should read better... the fact of the matter is that NVIDIA has had 2/3 of the market share for a long time. Ever since the NV40, NVIDIA has had a good market lead, because AMD (then ATI) didn't give users a good enough incentive to cross over.
Then there was the blunder formerly known as R600... it took AMD a couple of generations to get that right.

Now the focus is moving to GPGPU, and there NVIDIA has the lead... AMD is hardly present. Thanks for wasting my time :)
 