Intel kills off consumer version of Larrabee

Jeez. There is so much hatred for Intel here.

IMO, it is disappointing that Intel's graphics are so far behind schedule... but people who don't think Intel can pull off graphics are fooling themselves.

Why on earth would you think the company behind the Pentium, Conroe, and i7 wouldn't be able to pull off massively parallel processing? Intel has massive potential.

My only hope is they don't lose focus like NVIDIA. It seems like Tegra, Ion, Tesla, desktop, and laptop graphics were too much for them to handle. But if Anand's not worried, neither am I. :)


They failed to deliver; no hate here. Anyway, this was not the first time Intel tried to get into the discrete graphics market. They failed the first time back in the 90s with a product whose name I can't remember.
 
The reason is obvious: x86 is their own turf. They are the rule maker, they can mess around however they like, and compilers are optimized for their CPUs. With GPUs, where the real tech counts, that's where they suck. Intel fails in any open competition, including the 64-bit CPU instruction set. :D

AMD sells x86 CPUs, so how come they have been inferior to Intel for the past several years? It's not about owning all the rights and patents, because AMD was making better products for a while back in 2005 (Intel had a bad design with the Pentium 4). But when Intel fixed the design (really just reverting to the Pentium 3 design), their products have consistently blown away AMD's. It's the process technology. No other company can touch Intel's process technology. The process technology is what allows them to produce a 6-core CPU with 12 threads. If NVIDIA gave all their GPU designs to Intel, Intel could produce the same products that NVIDIA makes, but they could shrink the process and make NVIDIA's products run much faster.
 
I was looking forward to Intel launching a decent graphics card just so that there would be more competition. I kinda get the feeling Nvidia and ATI are playing around with us a bit.
 
That's like saying "Lamborghini gave us awesome super cars, they must be experts at building pickup trucks!" Different tech, different concepts, different goals.

Actually, Lamborghini started as a tractor manufacturer. They did succeed in moving to sports cars. So the question remains: why can't Intel produce a competitive GPU?
 
Who really wanted a first-gen Intel video driver anyway?

I'm sure they can produce a competitive GPU, given enough resource investment to counter their many years of slipping behind ATI/nVidia, but I just don't see THAT happening.
 
You know, if they can produce even a half-way decent mid-range card, that's most of the user base they've just won.

Oh well. Keep plugging away, Intel.
 
Intel never really adopted a strong GPU stance.

They seemed to have a "fight" attitude with SSE and MMX: "Oh, don't buy a GPU - we will just add floating-point extensions to the CPU so that you can do these accelerations on the CPU instead... but really badly in comparison."
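Roughly, the sort of SIMD floating-point work SSE bolted onto the CPU looks like this (a minimal sketch in C; the function and array names are made up for illustration, not anybody's real code):

```c
/* Rough sketch of the data-parallel float math SSE added to the CPU.
   Hypothetical example; names are invented for illustration. */
#include <xmmintrin.h>  /* SSE intrinsics */

void scale_add(float *dst, const float *a, const float *b, float k, int n)
{
    __m128 vk = _mm_set1_ps(k);              /* broadcast k into all 4 lanes */
    for (int i = 0; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);     /* load 4 floats at once */
        __m128 vb = _mm_loadu_ps(b + i);
        __m128 vr = _mm_add_ps(_mm_mul_ps(va, vk), vb); /* 4 multiply-adds per step */
        _mm_storeu_ps(dst + i, vr);
    }
    for (int i = n & ~3; i < n; i++)         /* scalar tail for leftovers */
        dst[i] = a[i] * k + b[i];
}
```

A GPU just does this same kind of lockstep math across hundreds of lanes instead of four, which is the whole point of the comparison.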

A 3D gamer probably would have benefited more from having the GPU on the motherboard (along with upgradeable RAM slots), with the integer CPU and drive controllers as add-in daughtercards.

Even today, I'd say that if your video card is drawing more wattage than your motherboard, it still wouldn't entirely be a bad idea to have a floating-point core on the motherboard and the integer one as an afterthought.

But - these things have a strange way of working out.

Edit: Adobe Flash 10.1 may underline this idea. 2D browser acceleration has long been in the realm of "CPU only," but everyone and their dog knows that things are a lot faster on the GPU - if you have the right GPU, that is. The main limitation has been the programmer, who has to understand and grasp the idea of floating-point programming (which is a heck of a lot more difficult than integer programming: n = x + 1, goto n).

It's been this way for a while now, even to the point where video cards will underclock when not in "3D" mode to save power. That will probably have to change back to always-on now that more apps and more programmers are figuring out that a pipelined GPU beats a quad-core CPU any day, if you can get your head around not programming for integers.

There actually is nothing that you can't do in floating point, although it does seem like quite the waste to use such a thing of beauty as a floating-point GPU just to do basic integer math.
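As a minimal sketch of that idea, and its main catch: a plain double handles whole-number math exactly, but only up to 2^53 (the numbers below are just for illustration):

```c
/* Doing "integer" math in floating point works exactly up to a point.
   A double has a 53-bit significand, so whole numbers stay exact only
   up to 2^53; past that, adjacent integers can no longer be told apart. */
#include <stdio.h>

int main(void)
{
    double a = 41.0, b = 1.0;
    printf("%.0f\n", a + b);              /* 42 - exact */

    double big = 9007199254740992.0;      /* 2^53 */
    printf("%d\n", big + 1.0 == big);     /* prints 1: the +1 is silently lost */
    return 0;
}
```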

That Intel is moving to a 32-core processor does mean they are moving closer to the massively parallel architecture of many GPUs, but the GPU has had the edge ever since it was made pipeline-programmable... BTW, one integer core for every 8 bits of data width is actually a golden number. A 256-bit bus fits very nicely onto a 32-core processor. Make it 256+32 bits wide, with a 32+4-core CPU, and you magically have a redundant-bit ECC CPU, which has never been made before - but only if it's programmable that way...

Onboard audio, if the motherboard were floating point, would have perfect sine-wave reproduction, which would definitely be superior to CD-quality sound... but I'm rambling...
 
What the hell are some of you on? Intel owns the graphics market too. They don't have a "gaming" or "high performance" chip, that's true, but it doesn't mean they don't know graphics. In fact, they know it very well, since no other company comes close to competing with their volume. That said, it seems like Larrabee was just being developed as Larrabee, and applications for the platform, while some ideas were floated, would be ironed out later. It looks like consumer graphics didn't make the cut or wasn't worth the effort, so they dropped it for now. I'm wondering what's going to become of some of their graphical ventures like Project Offset.
 
That is due to Intel marketing. Their IGP performance is not even close to AMD's or Nvidia's offerings. Their CPU engineers haven't got a clue how to build a GPU that will compete. Hence Larrabee got canceled; it was underperforming badly.
 
I find it unlikely that this delay is due to yield issues; Intel is very fast at ramping up new products and processes.

Intel has a boatload of cash, and there are a ton of extremely bright EE and CS people out there. I have no doubt that Intel has enough talent (or could hire enough talent) to get a serious contender into the GPU market within ~2 years of launching such a project. It's not as if no one knows the general design principles behind both ATI's and nVidia's products, and I can't imagine it would take Intel more than a month to CMP through one of those chips, take a bunch of SEM images, and completely reverse-engineer it if they were really clueless about how to start such a design.

So, IF Larrabee fails, it is likely the project was mismanaged or given the wrong direction from the beginning.
 
That is due to Intel marketing. Their IGP performance is not even close to AMD's or Nvidia's offerings. Their CPU engineers haven't got a clue how to build a GPU that will compete. Hence Larrabee got canceled; it was underperforming badly.
Where? Most recent reports have actual GFLOP output already at or near the predicted performance of Fermi (which really hasn't been ironed out yet, though). Also, since when has any other company taken Intel's approach to a "GPGPU"? Should the current test parts get refined or make it to market at these performance levels, it looks like they could take the scientific computing world by storm; performance that is that easy to code for? Yes please.
I find it unlikely that this delay is due to yield issues; Intel is very fast at ramping up new products and processes.

Intel has a boatload of cash, and there are a ton of extremely bright EE and CS people out there. I have no doubt that Intel has enough talent (or could hire enough talent) to get a serious contender into the GPU market within ~2 years of launching such a project. It's not as if no one knows the general design principles behind both ATI's and nVidia's products, and I can't imagine it would take Intel more than a month to CMP through one of those chips, take a bunch of SEM images, and completely reverse-engineer it if they were really clueless about how to start such a design.

So, IF Larrabee fails, it is likely the project was mismanaged or given the wrong direction from the beginning.
There are always some people who think they're smarter than a team of some pretty damn brilliant engineers. I agree, Intel is a powerhouse, and to be honest, I think Larrabee is more of an experiment and they're seeing where it will go. If it looks like GPGPU is its strongest performance arena, why not release there first and then further refine it for graphics processing? Despite how meticulously the enthusiast community has been following Larrabee (simply due to its implications or expectations, for that matter), I think it's going to be a bit before the rest of the world catches up, and then Larrabee, wherever it is, will matter.
 
I think Intel bet the farm on Itanium, but nobody wanted a painful 32-bit -> 64-bit switch.

To "bet the farm" means that if your bet fails, you lose everything you have. If Intel had "bet the farm" on Itanium, they would be bankrupt by now.
 
I think that if the world ever found out it didn't actually need integer calculations, Intel and AMD would be "up the creek".

Most of their intellectual property patents are based on integers. If they went with floating-point calcs, they would have to pay, I'm not sure who, Sun? Or maybe even the former ATi, for use of their patents.

If I were a conspiracist, I'd say that Intel intentionally released MMX and SSE just to keep a few programmers latched to the "intel instruction set/programming" for as long as they could.

Buying ATi was still probably a smart move. They paid too much, but IMO they had little choice going into the future.
 
GPU designers may not necessarily have been smarter than the Intel designers...

Intel designers were probably told specifically not to create too much infrastructure outside of the x86 or Itanium integer design(s). That's my take.

If Intel actually created a "good" GPU design, it would probably look too much like a current GPU design - and then they would get the pants sued off of them.
 
Where? Most recent reports have actual GFLOP output already at or near the predicted performance of Fermi (which really hasn't been ironed out yet, though). Also, since when has any other company taken Intel's approach to a "GPGPU"? Should the current test parts get refined or make it to market at these performance levels, it looks like they could take the scientific computing world by storm; performance that is that easy to code for? Yes please.

There are always some people who think they're smarter than a team of some pretty damn brilliant engineers. I agree, Intel is a powerhouse, and to be honest, I think Larrabee is more of an experiment and they're seeing where it will go. If it looks like GPGPU is its strongest performance arena, why not release there first and then further refine it for graphics processing? Despite how meticulously the enthusiast community has been following Larrabee (simply due to its implications or expectations, for that matter), I think it's going to be a bit before the rest of the world catches up, and then Larrabee, wherever it is, will matter.

Who cares about the GFLOP performance when we are talking about running PC games?
- Larrabee was not intended to compete with Tesla :p

Larrabee is a flop - an under-performer, just like the PressHot P4 when they canceled that entire program. Of course, good things will come of it, just as HT came out of NetBust. Intel might yet get competitive IG some day out of Larrabeast.

Who cares how brilliant their engineers are? They are only CPU-oriented - they have no clue how to do a GPU.

If you want proof: Intel canceled their own project after making a huge PR splash and predictions, buying up companies and IP, and sinking massive dollars and resources into it... and then being 2 years late on their own roadmap. They simply gave up on it as a consumer project. It is now a SW dev kit.
 
First, there is too much competition from Nvidia and ATI. Secondly, Intel would need to write and update drivers for the product regularly. How often do they update the drivers for their integrated graphics? Never! Seriously, imagine if they updated their crappy integrated graphics driver - you might actually be able to play a few games on it. A discrete Larrabee card would mean Intel would have to update the drivers every few months to accommodate new games, plus glitches, bugs, etc.

Something went wrong. They don't need TSMC; they've got their own fabrication plants. They would be pushing competitors out of the market and becoming even more monopolistic. I think Larrabee will become vaporware, or only be used for certain cloud computing applications by scientists. You won't be gaming on Larrabee, ever! I hear it doesn't even use DirectX; it renders graphics in a different way.
 
I think that if the world ever found out it didn't actually need integer calculations, Intel and AMD would be "up the creek".

Most of their intellectual property patents are based on integers. If they went with floating-point calcs, they would have to pay, I'm not sure who, Sun? Or maybe even the former ATi, for use of their patents.

If I were a conspiracist, I'd say that Intel intentionally released MMX and SSE just to keep a few programmers latched to the "intel instruction set/programming" for as long as they could.

Buying ATi was still probably a smart move. They paid too much, but IMO they had little choice going into the future.

But the world *DOES* need integer calculations. Floating point cannot be a substitute for integer work. If an OS tried, it would regularly crash and make Windows 95 look rock solid, due to a myriad of issues ranging from rounding errors to dealing with NaN and other fringe cases. Also, GPUs are managed by an outside controller; CPUs have to manage themselves. Even if Nvidia and ATI added wicked integer performance to their next card, you simply could not use it as a CPU - it just doesn't work. Look at microcontrollers and other embedded processors like ARM: most of them don't have an FP unit at all, and any FP work that is needed is emulated.
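A minimal illustration of those fringe cases, assuming ordinary IEEE-754 doubles (the snippet is just for illustration, not from any OS code):

```c
/* Two of the fringe cases mentioned above: accumulated rounding error and
   NaN behavior, either of which would wreck code expecting exact integer
   semantics. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double sum = 0.0;
    for (int i = 0; i < 10; i++)
        sum += 0.1;                   /* ten "exact" tenths */
    printf("%d\n", sum == 1.0);       /* prints 0: rounding error has crept in */

    double x = nan("");               /* a quiet NaN */
    printf("%d\n", x == x);           /* prints 0: NaN is not even equal to itself */
    return 0;
}
```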

Intel and AMD also bring advanced features like the NX bit and hardware virtualization. CPUs are packed with features, GPUs aren't.

I don't think it had anything to do with 32-bit -> 64-bit; it had to do with x86 (and x86-64) -> IA64.

The only reason to move to IA64 was 64-bit. Even if the instruction set is better, that doesn't really matter: software development is so far abstracted from the instruction level, and compiler tech is so much better at optimizing, that it hardly matters anymore what you are running on.
 
Who cares about the GFLOP performance when we are talking about running PC games?
- Larrabee was not intended to compete with Tesla :p

Larrabee is a flop - an under-performer, just like the PressHot P4 when they canceled that entire program. Of course, good things will come of it, just as HT came out of NetBust. Intel might yet get competitive IG some day out of Larrabeast.

Who cares how brilliant their engineers are? They are only CPU-oriented - they have no clue how to do a GPU.

If you want proof: Intel canceled their own project after making a huge PR splash and predictions, buying up companies and IP, and sinking massive dollars and resources into it... and then being 2 years late on their own roadmap. They simply gave up on it as a consumer project. It is now a SW dev kit.
No offense, but you sound just like any other typical "gamer head enthusiast": OMG WTF WUTZ TEH 3DMARK SCORE!? How in the world do you consider this a flop:
http://www.brightsideofnews.com/new...-1tflops---27x-faster-than-nvidia-gt200!.aspx

There's plenty of excellent engineering there, and it'll have many great uses if it's optimized correctly; we will see. So Larrabee doesn't run current gaming software as well as NVIDIA's and AMD/ATI's architectures - big whoop. Larrabee might give Intel a huge leg up if and when ray tracing hits the market, and for now they have quite a powerful product that could do a lot for them. Kudos to Intel for trying a completely new approach and architecture - the fact that they got an end product that is still powerful is very impressive. Is the "gaming on Larrabee" dream a flop? Looks like it for now. Is Larrabee a flop? Not at all; it's a success and could be a major win for Intel.
 

Why do I consider it a flop?

Well, for starters, Intel considers it a flop :p
They *canceled* it. :rolleyes:
 
No they haven't. Show me an article where they called/considered it a "flop." Flop indicates that something is a complete failure. That isn't the case here; a lot was learned, and there still might be some viable products coming down the line.
 
So what does a cancellation indicate to you?
- Success? :p

Larrabee was not canceled after many years of development and great expenditure of resources because it was successful.
- try looking up "cancel" in an online dictionary and see if there is anything you might yet learn about forthcoming Larrabee products.
 
Well, for starters, Intel considers it a flop
They *canceled* it.
They haven't really. The technology in Larrabee will still exist as part of other programs, especially Intel's next iteration of an x86 GPU in the future.

So what does a cancellation indicate to you?
- Success?
So...you're saying that if something is not a complete failure it has to be a success?
 
No they haven't. Show me an article where they called/considered it a "flop." Flop indicates that something is a complete failure. That isn't the case here; a lot was learned, and there still might be some viable products coming down the line.

flop = complete failure

Intel communicated to its owners/shareholders that they would have a product - not a give-away item, but something that would bring wealth to the company and its owners. Intel subsequently canceled that product. It will not be bringing in any additional revenues or profits. It is a complete failure as a commercial project.

Intel sank over three billion dollars [an estimate; the grand total will probably never be known] into the project...
 
So what does a cancellation indicate to you?
- Success :p
So you have no proof, and would rather throw around buzzwords and write off complex projects in simpleton terms. Excellent.
 
Sure, Intel wants to skip it altogether and move to vector 3D. Best guess as to why? No one has a strong patent or has really figured out an effective way to bring it to market yet.

Integer math is actually all doable in float. I never really understood it myself, but it's mainly to do with calculus, which is difficult to grasp. Sine minus sine divided by cotangent... whatever.

One step further than floating point is "fractal point," in which integer math is again possible. It comes from comparing two or three fractals in a grid. It still only requires a few calculations, and can actually be more accurate than floating point (32- or 64-bit) with less overhead (fewer transistors firing). It might be especially useful in image compression someday.

Integer math is merely comparing one number to another and using their relationship to get another number. Fractal math is more like comparing one image to another to get a new image (sort of like massively parallel integer math: you get the whole answer in one pass instead of requiring a billion little passes).
 
Why do I consider it a flop?

Well, for starters, Intel considers it a flop :p
They *canceled* it. :rolleyes:

You also call ~1 TFLOPS of performance underperforming... you say a lot of things... not all of them make sense.
 
flop = complete failure

Intel communicated to its owners/shareholders that they would have a product - not a give-away item, but something that would bring wealth to the company and its owners. Intel subsequently canceled that product. It will not be bringing in any additional revenues or profits. It is a complete failure as a commercial project.

Intel sank over three billion dollars [an estimate; the grand total will probably never be known] into the project...

You talk like they buried the whole Larrabee thing... damn, this thread is full of ignorance:
Intel Cancels Larrabee Retail Products, Larrabee Project Lives On

Read the headline :rolleyes:
 
They will be back when it makes sense.

I also sense that this does NOT bode well for NVIDIA. Sure, they aren't engineering the exact same thing, but they are strongly banking on GPU compute. I believe it's too early for that to be thrown on the consumer market; early development, maybe.

I am really curious to see what NVIDIA rolls out next and if they keep their current positioning. My take is they will change their tune.
 
So you have no proof, and would rather throw around buzzwords and write off complex projects in simpleton terms. Excellent.

There is no proof that I could show an Intel fan. Their minds are already closed.

Everyone else understands "cancellation" and failure.

You guys need to learn to read more than the headlines. From Ryan Smith's blog:
The first Larrabee chip (which for lack of an official name, we’re going to be calling Larrabee Prime) will be used for the R&D of future Larrabee chips in the form of development kits for internal and external use.
So Intel spent hundreds of millions of dollars, Larrabee is 2 years late, and they have zero to show but a *future* dev kit that is *maybe* coming next year.
:p
 
Lots of stupidity in this thread.

Intel has *working* Larrabee silicon. I've seen a working board myself at a 3rd-party developer site.
So if it's working, why won't they release it? Simple: it's not competitive.
Intel knows exactly how fast ATI's and NV's stuff is, and they've decided not to release a losing product.
This is great news for both AMD and Nvidia. More so for Nvidia on the HPC side.

There is simply no way to spin this favorably for Intel.
 
You also call ~1 TFLOPS of performance underperforming... you say a lot of things... not all of them make sense.

2900XT. Awesome specs, sucked ass.

Larrabee getting 1 TF doesn't mean a damn thing when it comes to gaming performance, ESPECIALLY given its architecture. Intel would be emulating in software what ATI and Nvidia do in hardware. It's an interesting idea, but it's going to be slower. Which means Intel has to write not just a driver (which in and of itself is a massive undertaking that Intel has no real experience with) - and we all know drivers can easily make or break a card release - but also a high-performance DX emulation layer for the card itself.

As far as I'm concerned, even a glxgears benchmark is a better way to judge Larrabee's performance than TFLOPS.
 

We are not talking theoretical AMD/NVIDIA PR performance here; let's compare the numbers in SGEMM:

http://www.lockergnome.com/theoracle/2009/12/05/what-is-intel-doing/
1. Intel Larrabee [LRB, 45nm] - 1006 GFLOPS
2. EVGA GeForce GTX 285 FTW - 425 GFLOPS
3. nVidia Tesla C1060 [GT200, 65nm] - 370 GFLOPS
4. AMD FireStream 9270 [RV770, 55nm] - 300 GFLOPS
5. IBM PowerXCell 8i [Cell, 65nm] - 164 GFLOPS

What was that?
Intel suddenly laying the smack down on GPGPU... in their first go.

No wonder NVIDIA wants Fermi out... before Intel plants itself in the HPC market.
AMD, so far, is the only one not playing the game... unless you think a single R800 GPU can do the same?
(Which, by AMD PR, is a 3+ TeraFLOP card.)
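For reference, SGEMM GFLOPS figures like these generally come from a simple formula: a square N x N SGEMM does roughly 2*N^3 floating-point operations, and you divide that count by the measured kernel time. A rough sketch with made-up numbers (not taken from the linked benchmark):

```c
/* Rough sketch of how SGEMM GFLOPS figures are usually computed.
   SGEMM does C = alpha*A*B + beta*C; for square N x N matrices that is
   roughly 2*N^3 floating-point operations. The timing value below is
   invented purely for illustration. */
#include <stdio.h>

int main(void)
{
    double n = 4096.0;                 /* hypothetical matrix dimension */
    double seconds = 0.137;            /* hypothetical measured kernel time */
    double flops = 2.0 * n * n * n;    /* ~1.37e11 floating-point operations */
    double gflops = flops / seconds / 1e9;
    printf("SGEMM: %.0f GFLOPS\n", gflops);   /* ~1000 GFLOPS */
    return 0;
}
```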
 
So a hand-tuned SGEMM kernel written for Larrabee, on an overclocked (read: they had to jack up the frequency) part, beats a year-old video card running non-tuned SGEMM? Color me impressed. Someone wrote an SGEMM kernel that was doing ~800 GFLOPS on a 48xx at stock. Imagine what a similarly hand-tuned SGEMM kernel would do on a 58xx, and you will see that Larrabee would not be competitive today even for GPGPU.
 
What was that?
Intel suddenly laying the smack down on GPGPU... in their first go.

WTF? You are completely delusional.
Intel certainly knows more about Larrabee's perf than you, and they just canceled the product you claim is laying the smack down.
 