Should Intel get back into the GPU game?

ShuttleLuv

Supreme [H]ardness
Joined
Apr 12, 2003
Messages
7,295
I say yes. As much as the i740 and subsequent releases were not on par speed-wise, I personally enjoyed the cards. The i740 actually ran decently: decent 2D, fast enough 3D, and at the time better image quality than 3dfx. The OpenGL ICD was a wrapper, but it worked. D3D worked with minimal problems. Now, that was years ago. How would they fare now? Who knows. But it makes one think. Would you like a Larrabee?
 
I mean, say they proposed a viable solution with good specs. Would anyone buy it? I think it could get really interesting here. ;)
 
They'll have a bit of a hurdle to win over enthusiasts who are used to only ATI and Nvidia, but if they put out a good, competitive product, people will buy it.
 
There is no bigger, more recognizable name in computers than Intel. More so than ATI or Nvidia could ever dream of. Intel has name recognition.
 
But Intel has a history of piss-poor GPUs. It'll take a lot for enthusiasts to look past that.
 
IF Intel produced a card that benchmarked well, the peeps at Nvidia and ATI would have a heart attack, and rightfully so. SHOULD they? I'll let the execs decide. It's not their core strength. They're the big dogs of the schoolyard, best fabs, etc.

I'll disagree with SicKlown. Heck, if Intel had just released the GTX 480, the headlines would be miles different.
 
They haven't because the add-on market is very small. I'd bet that 90% of computers don't have a dedicated video card. The return on developing a 3d card would probably be small compared to focusing on integrated solutions.

Personally, I say they are big enough. But they did well in the SSD market.
 
If it means more capable onboard GPUs, it'll be great. Netbooks that actually have good power at a good price. A $300.00 N10J would be a good reason to get a new netbook.
 
I'd like to see Larrabee. I'm hoping they'll revive it in a few years as a desktop GPU product since work is still being done on it as a compute product.
 
AMD has already absorbed ATI and the arrangement seems to be working out pretty nicely for them so far. Perhaps Intel should just buy Nvidia and use the marketing recognition and technical expertise for their own nefarious graphical purposes. The company was worth something like $6 billion in 2008, which is about the price of a new fab to Intel, a company that had revenues of $35 billion last year.
 
I believe AMD or Intel tried to buy nVidia before, but Hsu wanted a CEO or president position or some shit like that.
 
Well, Nvidia's stock has been downgraded and it isn't looking too good for them. Intel could probably do a hostile takeover if they wanted. Intel has already pushed them out of the chipset market and almost out of the discrete GPU market.

Personally, I want Intel to develop better GPUs, period, starting with their IGPs. The things need to be capable of playing back Blu-ray video.

I would like to see Larrabee make a comeback. A solid mid-range card would be enough to start out with to make money & gain market traction.

Hell, if Intel could just develop a card that provides hardware acceleration for their Havok physics engine, that alone would shake things up for the video game market and further knock down nVidia.
 
More competition is almost always a good thing.

Less of a chance for any of them to rest on their laurels...
 
I say yes. As much as the i740 and subsequent releases were not on par speed-wise, I personally enjoyed the cards. The i740 actually ran decently: decent 2D, fast enough 3D, and at the time better image quality than 3dfx. The OpenGL ICD was a wrapper, but it worked. D3D worked with minimal problems. Now, that was years ago. How would they fare now? Who knows. But it makes one think. Would you like a Larrabee?

Only if it offered competitive performance. Intel's Larrabee demos/prototypes have been WELL BEHIND the performance curve. Calling them abject failures would be putting it lightly. It would be real nice to have a 3rd competitor throw its hat into the ring to bring more competition to the playing field and hopefully help lower GPU prices in general. But it better be FAST and it better be cost efficient or it's just going to be a financial albatross for Intel.
 
If Intel bought Nvidia, then Nvidia GPUs made at Intel fabs would be insane. Can you imagine how good Fermi would be if it were made at Intel fabs? It would be low power, low heat. But I doubt it will happen; I'm not sure the government would even allow it.
 
But Intel has a history of piss-poor GPUs. It'll take a lot for enthusiasts to look past that.
It's not like they've been trying... They're aiming for the cheapest POS they can possibly build which can still handle Windows.

Saying they're piss-poor because they don't stack up against gaming cards is like berating the processor in your microwave for being slower than an i7.
 
It's not like they've been trying... They're aiming for the cheapest POS they can possibly build which can still handle Windows.

Saying they're piss-poor because they don't stack up against gaming cards is like berating the processor in your microwave for being slower than an i7.

The fact is that Intel has yet to create a decent GPU. Even their first real attempt (the i740) didn't have any success. And Larrabee in its first form couldn't cut it. Intel has a lot of successes in the computing environment, but 3D gaming isn't one of them, and I don't see that changing for a while.

Once integrated GPUs are added to the CPU by both Intel and AMD, the market for add-in cards is going to get a lot smaller (cards like the Nvidia 9500 and GT310, which make up a large part of the current market, should become obsolete). Any card released by Intel is going to be aimed at the HPC crowd and not be 3D-centric.
 
But Intel has a history of piss-poor GPUs. It'll take a lot for enthusiasts to look past that.
I think what ShuttleLuv means is that if Intel were to release a stand-alone GPU, there'd be a lot of average joes who would think it was the best in the world, even though it would probably suck ass. As an answer to the OP's question: no, Intel should not get back into the GPU market. Intel can just keep making amazing CPUs and I'll be happy.
 
It would be real nice to have a 3rd competitor throw its hat into the ring to bring more competition to the playing field and hopefully help lower GPU prices in general. But it better be FAST and it better be cost efficient or it's just going to be a financial albatross for Intel.

This. It doesn't matter if it's Intel, it just has to be a solid 3rd competitor in the market.
 
I think what ShuttleLuv means is that if Intel were to release a stand-alone GPU, there'd be a lot of average joes who would think it was the best in the world, even though it would probably suck ass. As an answer to the OP's question: no, Intel should not get back into the GPU market. Intel can just keep making amazing CPUs and I'll be happy.

+1. There would be a little too much bias toward Intel for the average joe. They have great marketing and are simply a KNOWN brand, much more so than AMD, although AMD is starting to gain a little bit of traction. I know quite a few people with store-bought computers that are AMD-powered, and they are very happy with them.

However, I think Intel dominates with almost 80% of the market, both in processors and integrated graphics (please correct me if my number is off there; just a guesstimate). If they were to produce discrete graphics cards that actually competed evenly with AMD/ATI and Nvidia and were really successful, it might be a bad thing.

How could it be a bad thing? Well, I get the impression that if Intel wanted to, they could produce a super high-end chip that smokes ATI and Nvidia EVERY generational release. Then they release their mid-range, which (not surprisingly) could be better than ATI/NV and much cheaper if they wanted to price it that way. So much so that in the long run, it could drive one of the current members of the discrete club out of business (hint: it wouldn't be AMD).

In that scenario, it seems that AMD/ATI would be in a better situation than NV if that ever occurred. They have great CPUs (not as powerful as Intel's, but very good and affordable), good graphics cards, and a great portfolio of solutions for a lot of very well-known products (e.g., graphics provider for the Xbox 360). Nvidia, on the other hand, is only making graphics cards, PS3 graphics, and chipsets...

One thing that DOES kill both companies is a LACK of owning their own fabs. Intel, for all intents and purposes, can make wafer after wafer of prototype chips during R&D and not worry about the price (materials, but that's about it), whereas AMD and Nvidia both have to pay for every wafer they want to try out at whatever price the fab wants.

As a guy who has owned products from all three companies, I'd love to see some competition from a third party, but it really feels like if Intel did jump into the market, it would be like waking a sleeping giant! They'd get beat up the first two rounds they compete in, but after that? All bets are off.
 
If it benched well and drivers were stable, I'd think about biting.
 
If Intel bought Nvidia, then Nvidia GPUs made at Intel fabs would be insane. Can you imagine how good Fermi would be if it were made at Intel fabs? It would be low power, low heat. But I doubt it will happen; I'm not sure the government would even allow it.

AMD bought ATI and AMD is still contracting to TSMC, so what's your basis for making that assumption?

How could it be a bad thing? Well, I get the impression that if Intel wanted to, they could produce a super high-end chip that smokes ATI and Nvidia EVERY generational release. Then they release their mid-range, which (not surprisingly) could be better than ATI/NV and much cheaper if they wanted to price it that way. So much so that in the long run, it could drive one of the current members of the discrete club out of business (hint: it wouldn't be AMD).

In that scenario, it seems that AMD/ATI would be in a better situation than NV if that ever occurred. They have great CPUs (not as powerful as Intel's, but very good and affordable), good graphics cards, and a great portfolio of solutions for a lot of very well-known products (e.g., graphics provider for the Xbox 360). Nvidia, on the other hand, is only making graphics cards, PS3 graphics, and chipsets...

One thing that DOES kill both companies is a LACK of owning their own fabs. Intel, for all intents and purposes, can make wafer after wafer of prototype chips during R&D and not worry about the price (materials, but that's about it), whereas AMD and Nvidia both have to pay for every wafer they want to try out at whatever price the fab wants.

As a guy who has owned products from all three companies, I'd love to see some competition from a third party, but it really feels like if Intel did jump into the market, it would be like waking a sleeping giant! They'd get beat up the first two rounds they compete in, but after that? All bets are off.

Doubt that for two reasons: 1) patents, 2) talent. Analysis and design of CPUs is not like analysis and design of GPUs. Intel has thousands of engineers all oriented around the x86 instruction set, few of whom are oriented around pushing pixels. None of these guys have the experience or expertise to start working on a project involving modern GPU compute architecture. Half the people at Intel could tell you, in good detail, exactly what happens when you call realloc(ptr, size), but few could tell you what happens when you call DeleteTextures(sizein, txtptr) (part of the OpenGL spec).

Intel has consistently made bad graphics products for years (I understand the G945's power envelope, and I understand it's not supposed to do anything spectacular, but it would be nice if it could do H.264 video decoding, like, oh I dunno, every other graphics product on the market).
 
Doubt that for two reasons: 1) patents, 2) talent. Analysis and design of CPUs is not like analysis and design of GPUs. Intel has thousands of engineers all oriented around the x86 instruction set, none of whom are oriented around pushing pixels. None of these guys have the experience or expertise to start working on a project involving modern GPU compute architecture.

Noted.

And that link was pretty cool. Until you read things like that, you never really understand how COMPLICATED this stuff can be. I salute them!
 
AMD bought ATI and AMD is still contracting to TSMC, so what's your basis for making that assumption?

Because Intel has a lot of fabs. Why use TSMC, which sucks at 40nm, when Intel does it just about better than anyone in the world? It just makes sense to use the fabs that will make a better product. Hell, I bet if Intel did Fermi on their 45nm it would run cooler and use less power than on TSMC's 40nm.
 
Doubt that for two reasons: 1) patents, 2) talent. Analysis and design of CPUs is not like analysis and design of GPUs. Intel has thousands of engineers all oriented around the x86 instruction set, few of whom are oriented around pushing pixels. None of these guys have the experience or expertise to start working on a project involving modern GPU compute architecture. Half the people at Intel could tell you, in good detail, exactly what happens when you call realloc(ptr, size), but few could tell you what happens when you call DeleteTextures(sizein, txtptr) (part of the OpenGL spec).

And you don't think Intel can just buy out some of the "talent" with the deep pockets they have? You don't think some of these GPU engineers would mind switching jobs for, say... a 50% pay raise? How about a 100% pay raise? In the end, money talks. So far Intel hasn't looked super serious about getting into the GPU business, not when they launch, excuse me, fail to launch half-assed products like Larrabee.

Intel has consistently made bad graphics products for years (I understand the G945's power envelope, and I understand it's not supposed to do anything spectacular, but it would be nice if it could do H.264 video decoding, like, oh I dunno, every other graphics product on the market).

So far, Intel has been content with making integrated graphics components for their chipsets. But if they ever decided to get serious about discrete graphics, there's no doubt they have the money and resources to make themselves viable contenders. The main doubt is whether they want to INVEST the money to do so. The only reason they haven't done it so far is probably that they don't have enough of a profit motive to make that move... yet.
 
You should read more before making such an ignorant post. Intel tried to make one but failed miserably. And money means shit; Microsoft is twice as big as Intel, but it still fails with many new products.
 
But Intel has a history of piss-poor GPUs. It'll take a lot for enthusiasts to look past that.

And most people wouldn't care, because their largest market would be OEM systems, which are already filled with Intel IGPs anyway.
 
I was under the impression that the "hardware" Larrabee would return at some point, or whenever Intel deemed fit.
 
AMD bought ATI and AMD is still contracting to TSMC, so what's your basis for making that assumption?

It takes time to develop GPU architectures, years of time, and with GlobalFoundries pursuing a different manufacturing method for its cutting-edge processes than TSMC this time around, ATI has to redesign its chips in order to make the transition, whether they're in the pipeline or actually being sold. That's why it's taking until Northern Islands for GF to take over.
 
And most people wouldn't care, because their largest market would be OEM systems, which are already filled with Intel IGPs anyway.


From what I took from the OP's initial post, he's asking about high-performance dedicated cards. The majority of people who buy these cards have a pretty good grasp of the market. The ones who don't tend to go Nvidia because it's the brand mentioned the most.

For OEM systems, there's not going to be a need for any kind of video card anymore, since it's being pushed onto the CPU by both camps. The only way a dedicated high-performance card makes sense for Intel is if they want to beat Nvidia in the HPC sector. The real test is to see how the x86 architecture works for these loads. Knowing Intel's software ability, they would get it right sooner rather than later.
 
More competition is almost always a good thing.

Less of a chance for any of them to rest on their laurels...

Agreed, it sucks that Intel loves to talk about getting into the GPU market but then always cancels the projects related to it.

I would really love to see a 3rd contender in the GPU gaming market.
 
I believe AMD or Intel tried to buy nVidia before, but Hsu wanted a CEO or president position or some shit like that.

lol... who is Hsu? The CEO of NVIDIA is not Hsu. You should probably do your research before you post.
 
lol... who is Hsu? The CEO of NVIDIA is not Hsu. You should probably do your research before you post.
Hsun, Hsu, whatever, you know who I'm talking about -- I'm not going to remember some unimportant wanker's name.
 