Gelsinger says Larrabee will wipe the floor with CPGPU

gryoung

Another INQ special...
http://www.theinquirer.net/gb/inquirer/news/2008/07/01/gelsinger-larabee-wipe-floor

For everyone saying Nvidia is too arrogant a company and needs to be taken down a notch: NV has nothing on Intel in the arrogance department. Intel hasn't had an original tech idea in quite a while; that's why AMD caught them off guard with the Athlon CPUs. Intel's response? Weld two cores together in the same package and call that cutting edge. At first it was a joke, but then they got really serious, out came Core 2, and it wasn't so funny anymore. The only reason they got away with it is that AMD is its own worst enemy when it comes to development schedules; it literally dropped the banana peel in front of itself and squandered the 1-2 year lead it had on Intel.

Now Intel's Laughabee (Larrabee) is going to be the GPU killer? By taking many (tens of) x86 cores, welding them together in an array, and calling it a parallel supercomputer? Of course the business sheep out there will buy thousands of them for server/compute farms, since they like "compatibility" and are willing to accept 30-year-old x86 technology, which has been built up like the Tower of Babel with extension after extension every generation.

I just hope ATI and Nvidia wake up and smell the coffee and start getting ready for what's coming. I think Intel is very serious about Larrabee and about taking down both companies.
A lot of people laughed at the idea of Intel regaining the lead on AMD when the Athlon was king; I don't hear much laughing now.
The same people are laughing at the thought of Intel in the graphics arena. We'll see what happens in the 2010-2011 timeframe.

Personally I dislike Intel's corporate philosophy (not so much their products) of stifling innovation and not innovating until it's absolutely necessary. An Intel-dominated graphics world just doesn't bear thinking about.
 
PR battle. Intel belittling nVidia and nVidia responding in kind.
 
Intel should stop the trash-talking until they can deliver the goods.
 
Another INQ special...
http://www.theinquirer.net/gb/inquirer/news/2008/07/01/gelsinger-larabee-wipe-floor

Intel hasn't had an original tech idea in quite a while; that's why AMD caught them off guard with the Athlon CPUs. Intel's response? Weld two cores together in the same package and call that cutting edge...

My fanboys, let me show you them.

In case you missed the train, the Core 2s are awesome and there's nothing close to them in the desktop or enthusiast markets. I'm no Intel fanboy, I had an Athlon XP, but give credit where credit is due and stop flaming.
 
Personally I dislike Intel's corporate philosophy (not so much their products) of stifling innovation and not innovating until it's absolutely necessary. An Intel-dominated graphics world just doesn't bear thinking about.
Well, I can't say I've been exactly thrilled with Nvidia's domination in graphics either. You could just as well be talking about them when it comes to stifling innovation; we've been treated to one tired rehash after another for a year and a half. And like Intel, they've been taken by surprise by their main competitor.
 
Right. Let's be honest: both companies act like total ****s sometimes, and neither ever hesitates to throw crap at the consumer if they think they can get away with it.

Competition keeps these companies innovative and progressive while keeping prices low. Look at what NV did when ATI's cards weren't much competition. ATI has caught up with them now, but that may not always happen if the big players drive their competitors off the market.

~S
 
The medium-to-high-end graphics card market could do with a third player, to be honest; with only two, the market is a bit too unstable.

Nvidia should also start making CPUs. That would give us a nice three-way race for both CPUs and GPUs and keep the tech moving forward at a decent pace.

I can't wait to see what Intel have to offer, I hope it rocks.
 
oh boy I'm going to have a tough time avoiding name calling in this one...

AMD caught Intel with the Athlon 64 because A) the Athlon 64 was the culmination of several great ideas, B) Intel was preoccupied with its billion-dollar sinkhole, IA-64, and C) Intel decided that the best way to go in the consumer field was lower efficiency per clock but more clocks. We'd all be singing a different tune about NetBurst if Intel had hit the 10GHz it promised.

Then the exact reverse happened. AMD decided (in the same manner Intel had decided about NetBurst) that their new architecture was the bomb. They shrank dies, decreased timings, and increased clock speeds, but made no serious attempt to change the architecture. Intel took a step back, looked at the Pentium 3's efficiency per clock and the Pentium 4's clock count, and did the obvious: combined the two. The result was, in effect, a sub-2.0GHz Pentium 3 that showed K8 who's boss.

Firstly (and this point is made moot by my next paragraph, but I'll bring it up anyway): very few people on this forum have the ability to say what's original in a CPU architecture, because of its immense complexity. AMD is certainly better at marketing their ideas in consumer-friendly packages, but that doesn't mean they have any more or fewer original ideas than Intel. When your processor has a big 64 written across it and the other guy's doesn't (and hell, if you're even inquisitive enough to ask, you find out his is only 32), it's assumed that's a new idea from AMD. It's not; it was just the natural evolution of x86. Arguably, at the time, Intel was right to stick with 32-bit x86: in the era of 512MB computers, who wants more than 4GB of memory? If you're talking about "original ideas" like the on-die memory controller and the HyperTransport variant of the FSB, brought to you by AMD, then you are clearly not qualified to make a statement like "Intel hasn't had any original tech ideas for quite a while now", because you clearly don't understand the intricacies of the hardware. Intel has had original ideas; some failed, some didn't. Could I list them for you? Hell no, I'd need about $600,000 and six years in front of some prof at MIT to do that, but I know they exist, because Conroe caught Toledo with its pants down.

That said, does it really matter how the thing is powered? Does it really matter if it's innovative or not? If it goes like hell and has a small silicon footprint, who cares whose idea it was? In the computer hardware business it's not who has the idea but who implements it best. Having the idea first simply means you get it in this revision; the other guy gets it the one after next.

And as for the article: just another pile of PR BS.
 
and in regards to the article, just another pile of PR BS.

QFT!
 
One thing I chuckled at was Intel telling developers to get ready for "tens, hundreds or thousands" of cores. I know they meant x86 cores, but there was already a demonstration of a 1024-core system (4 x 9800GX2) used for medical imaging last month, and ATI will have the 4870X2 next month, where a CrossFire system will have 3200 cores (or 6400 cores using 4 slots, though probably not in CF). :p
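For anyone wanting to check those figures, here is a minimal sketch of the arithmetic, assuming the stream-processor counts advertised at the time (128 per G92 GPU, 800 per RV770 GPU); these marketing "cores" are of course not comparable one-for-one with x86 cores.

```python
# Back-of-the-envelope "core count" arithmetic from the post above.
# Shader counts are the era's marketing figures (assumptions, not
# a like-for-like comparison with general-purpose x86 cores).

G92_SHADERS = 128      # stream processors per G92 GPU (9800-class)
RV770_SHADERS = 800    # stream processors per RV770 GPU (HD 4870)

# 9800GX2 = two G92 GPUs per card; the demo used 4 cards.
gx2_quad = 4 * 2 * G92_SHADERS
print(gx2_quad)        # 1024 -> the "1024 core" imaging system

# 4870X2 = two RV770 GPUs per card; CrossFire pair = 2 cards.
x2_crossfire = 2 * 2 * RV770_SHADERS
print(x2_crossfire)    # 3200

# Four X2 cards filling four slots.
x2_quad = 4 * 2 * RV770_SHADERS
print(x2_quad)         # 6400
```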
 
I'll be right here to see Larrabee trying to "wipe the floor with CPGPU". I highly doubt it though.
 
My fanboys, let me show you them.

In case you missed the train, the Core 2s are awesome and there's nothing close to them in the desktop or enthusiast markets. I'm no Intel fanboy, I had an Athlon XP, but give credit where credit is due and stop flaming.

I'm not trying to bash the Core 2. As I said, I think AMD didn't take Intel's comeback efforts seriously and got caught off guard after becoming a bit arrogant about the lead their Athlon tech gave them. Intel did a superb job of pulling themselves out of a hole once they realized they'd been caught out.

My point was that NV is behaving much the way Intel did before the Athlon: Jen-Hsun Huang is treating Intel's interest in entering the GPU field as a joke, and it could come back to haunt him.
Looking at the GT200 series before the latest barrage from ATI, everyone thought NV had the high end sewn up. Now, with the latest reviews combined with NV's price cuts, you'd have to think that taking a competitor like ATI lightly is a painful, though not fatal, mistake; they are going through a humbling experience at the moment. See here: http://www.tgdaily.com/content/view/38237/135/

I for one hope the competition between ATI and NV continues. With AMD/ATI working on the Fusion project, I think Intel will have to respond in some way or risk being caught off guard again.

As another poster said, a three-way competition would be good for the market. I for one think $600+ high-end GPUs just aren't sustainable; ATI has gotten its message out loud and clear (price/performance matters to mainstream consumers) and will be rewarded for it.
 
I've heard Larrabee won't actually be that good as a GPU. It will likely take its successor for it to be an attractive option, but I could be wrong.
 