Should Intel get back into the GPU game?

AMD bought ATI, and AMD is still contracting out to TSMC, so what's your basis for making that assumption?



Doubt that, for two reasons: 1) patents, 2) talent. Analysis and design of CPUs is not like analysis and design of GPUs. Intel has thousands of engineers oriented around the x86 instruction set, few of whom are oriented around pushing pixels. None of these guys have the experience or expertise to start working on a project involving modern GPU compute architecture. Half the people at Intel could tell you, in good detail, exactly what happens when you call realloc(ptr, size), but few could tell you what happens when you call glDeleteTextures(n, textures) (part of the OpenGL spec).
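To make that contrast concrete, here's a minimal sketch in C. The two calls are real (realloc from the C standard library, glDeleteTextures from OpenGL), but the surrounding function is purely illustrative and assumes a current GL context:

    #include <stdlib.h>   /* malloc, realloc, free */
    #include <GL/gl.h>    /* glGenTextures, glDeleteTextures */

    void cpu_world_vs_gpu_world(void)
    {
        /* CPU world: grow a heap block in place if possible, otherwise
           copy it to a new allocation. Most systems programmers can
           describe this off the top of their head. */
        int *buf = malloc(16 * sizeof *buf);
        int *grown = realloc(buf, 64 * sizeof *buf);
        if (grown)
            buf = grown;
        free(buf);

        /* GPU world: hand a list of texture names back to the driver,
           which has to unbind them from every texture unit, retire any
           in-flight work referencing them, and reclaim the VRAM --
           a very different mental model. (Assumes a current GL context.) */
        GLuint tex;
        glGenTextures(1, &tex);
        glDeleteTextures(1, &tex);
    }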

Intel has consistently made bad graphics products for years. (I understand the G945's power envelope, and I understand it's not supposed to be spectacular, but it would be nice if it could do H.264 video decoding, like, oh I dunno, every other graphics product on the market.)
Both of those issues Intel could easily solve. Patents can be overcome by cross-licensing deals; Intel has patents for the x86 architecture that Nvidia wants access to, so there could be deals made there or in the chipset business. Talent could be bought or grown, and Intel has large capital reserves it could throw at the issue. With Intel's fab tech, if they got serious, both ATI and Nvidia would probably start hurting: being pretty much one process node ahead of TSMC would mean that an OK GPU would become a good GPU just by pushing the clocks up.

The only catch is that Intel would need a good business reason to enter the market. Development of the 5870 took ATI almost 3 years, and that was working with a lot of established IP and designers. Anything coming out of Intel would probably take at least 5 years to get to market, with the first product being less than impressive. Until Intel has a good reason to sink billions of dollars over several years into a GPU, they won't be doing it.
 
From what I took from the OP's initial post, he's asking about high-performance dedicated cards. The majority of people who buy these cards do have a pretty good grasp of the market. The ones who don't tend to go Nvidia because it's the brand mentioned the most.

For OEM systems, there's not going to be a need for any kind of video card anymore, since graphics is being pushed onto the CPU by both camps. The only way a dedicated high-performance card makes sense for Intel is if they want to beat Nvidia in the HPC sector. The real test is to see how the x86 architecture handles those workloads. Knowing Intel's software ability, they would probably get it right sooner rather than later.
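For flavor, this is the kind of wide-vector x86 code such a part would live or die on in HPC: a minimal SAXPY sketch using standard SSE intrinsics. The kernel itself is just illustrative, not anything from an actual Intel product:

    #include <xmmintrin.h>  /* SSE intrinsics */

    /* y[i] += a * x[i], four floats per iteration.
       Assumes n is a multiple of 4 and both pointers are 16-byte aligned. */
    void saxpy_sse(float a, const float *x, float *y, int n)
    {
        __m128 va = _mm_set1_ps(a);           /* broadcast a to all 4 lanes */
        for (int i = 0; i < n; i += 4) {
            __m128 vx = _mm_load_ps(x + i);
            __m128 vy = _mm_load_ps(y + i);
            vy = _mm_add_ps(vy, _mm_mul_ps(va, vx));
            _mm_store_ps(y + i, vy);
        }
    }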

Ya, my post was rather... redundant, since Intel IGPs are all over already :)

Didn't know AMD was working on integrating the GPU as well!
 
Intel has proven to suck in the GPU department; no matter what they release, it sucks compared to the cheapest offerings from Nvidia / ATi.
 
Really, Nvidia and ATi are both eating the scraps left over by Intel already. Intel owns the integrated market and will likely grab even more of it with the upcoming integrated graphics component in CPUs. The high-end discrete market is only a small percentage of the overall graphics-chip marketplace; it just happens to be the "sexy" part, and is what the media concentrates on.

Although I agree their IGPs should do Blu-ray by this point in time.
 
Ya, my post was rather... redundant, since Intel IGPs are all over already :)

Didn't know AMD was working on integrating the GPU as well!

They're testing a CPU based on the K10.5 architecture with what they call an APU. I believe its codename for now is Llano. It's supposed to ship in 2011, alongside Bulldozer. Its graphics performance is said to be better than the chip in the 890GX chipset, and it will also have full DX11 support.
 
Intel has proven to suck in the GPU department; no matter what they release, it sucks compared to the cheapest offerings from Nvidia / ATi.

I disagree. A few bad apples is something every company experiences. I can remember when the best ATi could do was the Rage 128, and look at them now. Don't be so closed-minded. Intel could really pull this off: they have the resources, the name recognition, and the experience. It comes down to whether they're dedicated enough, or would rather stay focused on CPUs.
 
I disagree. A few bad apples is something every company experiences. I can remember when the best ATi could do was the Rage 128, and look at them now. Don't be so closed-minded. Intel could really pull this off: they have the resources, the name recognition, and the experience. It comes down to whether they're dedicated enough, or would rather stay focused on CPUs.

Intel can make excellent hardware, and they can make good drivers for chipsets with *standardized* interfaces (SATA/RAID/PCIe/USB). But where they have faltered is in the video drivers for their IGPs.

The reason drivers for enthusiast-oriented video are so hard to write is that NOTHING is "standard" about video drivers for gamers/video nuts.

There is an effectively infinite number of combined shader programs that developers can use, and only a finite number of those shader combos can be fully tested in a reasonable amount of time. Further, if a developer decides to use an entirely new pipeline configuration, you have to optimize for that case, or else face the wrath of losing to your competitor in a top-tier game/benchmark. All this time, you have to make sure you're not breaking any of the countless patches and performance tweaks you've already introduced to the code base. Chipset drivers have almost no complexity compared to this.
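To give a feel for why the combinatorics blow up, here's a hypothetical C sketch of the kind of cache key a driver might keep per compiled pipeline. None of these names come from a real driver; the point is that the number of distinct keys is the product of every field's possibilities:

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical pipeline-state key: every distinct combination of
       shaders and fixed-function state is a separate compiled program
       the driver may need to cache, test, and hand-optimize. */
    typedef struct {
        uint64_t vertex_shader_hash;   /* illustrative field names */
        uint64_t pixel_shader_hash;
        uint32_t blend_state;
        uint32_t depth_stencil_state;
        uint32_t raster_state;
        uint32_t pad;                  /* keep struct bytes deterministic */
    } PipelineKey;

    /* FNV-1a over the whole key. The key space is the product of all
       field ranges, so one new shader pair in a shipping game means new
       cache misses, new compiles, and potentially new fast paths. */
    uint64_t pipeline_key_hash(const PipelineKey *k)
    {
        const unsigned char *p = (const unsigned char *)k;
        uint64_t h = 14695981039346656037ULL;
        for (size_t i = 0; i < sizeof *k; i++) {
            h ^= p[i];
            h *= 1099511628211ULL;
        }
        return h;
    }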

There is also a huge number of standards and proprietary features your card must support to remain competitive. Here is a short list of the "standards" I can come up with from memory:

DirectX versions 3-11, OpenGL 1.0 to version 4.0, DXVA 1.0-2.0, OpenCL, DirectCompute.

And proprietary extensions/features like:

Stream, CUDA, 3D Surround, Eyefinity, SLI, CrossFire.

Mesh that together with a fancy interface to control all these features, and you've got one helluva job just producing something stable, even if you put a ton of effort in. Intel has NEVER taken on a driver job like that and done it on time, and that's just for their integrated GPUs with virtually no high-end features. Intel has *never* experienced the software hell that ATI and Nvidia know so well.

They might be able to learn on the job, but those first five years or so are going to be hell in the driver department. And nobody is going to buy their GPU while Intel learns how to make complex software.
 
I would love to see Intel get in the GPU game. That way people can see what shitty drivers ACTUALLY look like.
 