Apple Looking For Ways To Ditch Intel?

Both Intel and AMD are x86/x64. Why would switching to AMD require anything other than a few tweaks? And that's assuming they bothered to tweak anything at all, since it would run fine as-is.

That's why I said using AMD would be an option.
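For what it's worth, the vendor only even becomes visible if you go asking CPUID for it. A minimal sketch (assuming GCC or Clang on x86-64, whose <cpuid.h> provides __get_cpuid):

[code]
/* The same x86-64 binary runs on Intel or AMD; the vendor only
 * shows up if you explicitly query CPUID for the vendor string. */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* Leaf 0: the vendor string comes back in EBX, EDX, ECX (in that order). */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* "GenuineIntel" or "AuthenticAMD" -- everything else the
     * program executes is identical either way. */
    printf("CPU vendor: %s\n", vendor);
    return 0;
}
[/code]

The same binary prints "GenuineIntel" on one box and "AuthenticAMD" on the other; every other instruction it runs is identical.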
 
LOL @ Making their developer community port everything.... AGAIN...

Motorola 68000 --> PowerPC (1994)

Mac OS --> OS X (2001)

PowerPC --> x86 (2006)

x86 --> Homegrown ARM? (2013?)

I don't understand why Apple developers don't run away screaming.
 
AltiVec 2.0 here we come, with 200% Photoshop acceleration.

Which means the real question is whether Apple thinks they are going to get vastly better than Intel at designing a GPU, or how they are going to get nVidia or AMD to do it for them (possibly by buying the whole company).

Buying AMD is a non-zero possibility. I would be even less surprised if they bought AMD and still switched to ARM (included in the plan to buy AMD).

OpenCL certainly looks like it should be handling most of the heavy lifting a Mac notebook does, and they certainly would be willing to inflict whatever on the desktop line to serve the mobile side. My guess is that this should be re-titled "Is Apple looking for ways to ditch the desktop?"
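For anyone curious what "let OpenCL do the heavy lifting" actually looks like, here's a toy host program that squares an array on whatever device the driver offers first. This is a sketch, not production code -- error checking and cleanup are stripped for brevity:

[code]
/* Minimal OpenCL offload sketch: square N floats on the default
 * device (the GPU on a typical Mac). */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static const char *src =
    "__kernel void square(__global float *v) {"
    "    size_t i = get_global_id(0);"
    "    v[i] = v[i] * v[i];"
    "}";

int main(void)
{
    enum { N = 1024 };
    float data[N];
    for (int i = 0; i < N; i++) data[i] = (float)i;

    cl_platform_id plat;  cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Build the kernel from source at runtime -- the same source
     * runs on any vendor's GPU or CPU that has a driver. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "square", NULL);

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof data, data, NULL);
    clSetKernelArg(k, 0, sizeof buf, &buf);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof data, data, 0, NULL, NULL);

    printf("data[3] squared = %f\n", data[3]);  /* expect 9.0 */
    return 0;
}
[/code]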
 
Do you folks forget that Apple used their own processors for many years?

There is a reason they stopped and went Intel.

Nope.

They never used their own processors.

The original Macs used Motorola 68k series processors (a great CPU for its time, used in many applications including the Amiga, and still used in many embedded systems).

Starting in the mid-90s, Macs used PowerPC chips. Apple was part of the consortium involved in bringing them to life, but development was mostly IBM's (as part of their POWER architecture), and production was split between Motorola and IBM. Apple was mostly involved due to its status as the biggest customer for the chips.

Then they switched to x86 in 2006.

The only chips of Apple's own design they have ever used are the mobile chips codenamed Axx, and while these are in-house designs, they rely heavily on licensing and work done by ARM.
 
This is another of those brain-dead rumors--based on purely uninformed speculation--in the wake of ARM's announcement that "by 2015" it would be able to license a single-threaded 64-bit ARM CPU with roughly the performance of a '486/Pentium 90MHz. There is so much bad information out there about Apple it boggles the mind. Some people think Apple "made its own CPUs" at one time--unbelievable. The only place ARM performance is "catching up to Intel" is in the imaginations of some people.
 
This is another of those brain-dead rumors--based on purely uninformed speculation--in the wake of ARM's announcement that "by 2015" it would be able to license a single-threaded 64-bit ARM CPU with roughly the performance of a '486/Pentium 90MHz. There is so much bad information out there about Apple it boggles the mind. Some people think Apple "made its own CPUs" at one time--unbelievable. The only place ARM performance is "catching up to Intel" is in the imaginations of some people.

Well, current designs are highly focused on power efficiency.

I don't think we know how well ARM would scale if given a 35, 65, or even 125W power envelope.

My guess is not even close to AMD or Intel at first, but to be fair, I just don't think this information is available, because no one has been silly enough to try it :p
 
Why not? You need some powerhouses for Photoshop, 3D rendering, etc., but the majority of Apple's customers look at photos, browse the web, and listen to music. The more they keep in-house, the higher their profit margins. Seems like a smart business decision.

It sure as heck won't be an Apple factory making the chips; they want to get away from Intel, but they'd outsource to someone like Samsung, LG, or Foxconn, all of which keep trade secrets about as well as a streaker at the Super Bowl keeps covered.
 
ARM is going 64-bit in 2014.

Also, some of you seem to be forgetting why Apple switched from PPC to Intel: IBM utterly failed to develop a 3GHz version of the PowerPC G5... for two years.

Jobs famously said they had been concurrently developing OS X for Intel for five years prior to that. Fact is, Apple will use any reason they can to jump ship. The fact that switching to ARM would give them unprecedented lock-down on their hardware is just a little bonus, I'm sure.
 
I think AMD being bought by Apple would be a more likely scenario. AMD is already partnering with ARM to put ARM tech in the Opterons; maybe AMD coupled with Apple's money could get OpenCL and ARM apps mainstream...

Anyone with programming experience in both x86 and ARM care to talk about the advantages and disadvantages of going the ARM route?
 
I think AMD being bought by Apple would be a more likely scenario. AMD is already partnering with ARM to put ARM tech in the Opterons; maybe AMD coupled with Apple's money could get OpenCL and ARM apps mainstream...

Anyone with programming experience in both x86 and ARM care to talk about the advantages and disadvantages of going the ARM route?

Unless you are doing assembler programming, I'd say none.
 
Unless you are doing assembler programming, I'd say none.

I don't think you have to go quite that low, but yeah, most coding done in a modern development tool above the low-level APIs should be pretty much the same.
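To make that concrete, here's a small hypothetical example: the checksum function is plain C and builds unchanged for either ISA, and the architecture only shows up if you explicitly ask the preprocessor for it (standard GCC/Clang predefines assumed):

[code]
/* Plain C is ISA-agnostic: the compiler emits x86 or ARM
 * instructions as appropriate, and the source never changes. */
#include <stdio.h>
#include <stddef.h>
#include <stdint.h>

/* 100% portable; nothing here knows or cares about the ISA. */
static uint32_t checksum(const uint8_t *p, size_t n)
{
    uint32_t sum = 0;
    while (n--)
        sum += *p++;
    return sum;
}

/* The architecture only leaks through when you go looking for it,
 * e.g. to select hand-written assembly or intrinsics. */
static const char *isa(void)
{
#if defined(__x86_64__) || defined(__i386__)
    return "x86";
#elif defined(__aarch64__) || defined(__arm__)
    return "ARM";
#else
    return "something else";
#endif
}

int main(void)
{
    uint8_t data[] = { 1, 2, 3, 4 };
    printf("checksum = %u on %s\n", checksum(data, sizeof data), isa());
    return 0;
}
[/code]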
 
I've yet to see an Apple user that I would call a power user.

Some next gen ARM design would cater just fine for most Apple users and most Windows users too.

I know a lot of folks using dual-core 1.6GHz Atom boxes for general business use quite happily.
 
I've yet to see an Apple user that I would call a power user.

Some next gen ARM design would cater just fine for most Apple users and most Windows users too.

I know a lot of folks using dual-core 1.6GHz Atom boxes for general business use quite happily.

They tend to think they are power users though.


"I need these 16 cores and 32 GB of RAM my Mac Pro give me for my, uh , Photoshop.

Right, Photoshop, that's it!"
 
Zarathustra[H];1039300321 said:
They tend to think they are power users though.


"I need these 16 cores and 32 GB of RAM my Mac Pro give me for my, uh , Photoshop.

Right, Photoshop, that's it!"

Actually, Final Cut Pro, Adobe Premiere Pro, and After Effects will quickly burn through 32GB and can easily use more than 16 cores.
These individuals are few and far between, but they do exist.
 
Zarathustra[H];1039300321 said:
They tend to think they are power users though.

"I bought this because it's what the professionals use!" - Every art student with too much of their parents' money.
 
I like that, Apple buying AMD. Why don't they just copy an Intel CPU and make it better? AMD is, like, dead.
 
I like that, Apple buying AMD. Why don't they just copy an Intel CPU and make it better? AMD is, like, dead.

They would first have to convince Intel to sell them an x86 license.
Then they would have to build up the resources to make them.

I don't see it happening, at least not with x86.
 
They would first have to convince Intel to sell them an x86 license.
Then they would have to build up the resources to make them.

I don't see it happening, at least not with x86.

I might be wrong, but I think he was making a joke about Apple copying Intel.
 
I like that, Apple buying AMD. Why don't they just copy an Intel CPU and make it better? AMD is, like, dead.
Apple is a company with a "good enough" attitude, so AMD would work fine for them.

Why spend money to excel when you can charge a premium for merely "good enough" products?
 
I like that, Apple buying AMD. Why don't they just copy an Intel CPU and make it better? AMD is, like, dead.

Better yet, Apple could buy AMD, design an x86 chip, patent it, and then sue Intel for making x86 chips. Fits right in with their business plan now.
 
Better yet, Apple could buy AMD, design an x86 chip, patent it, and then sue Intel for making x86 chips. Fits right in with their business plan now.

Except the x86 license that AMD has is NON-transferable. If Apple bought AMD, they would get AMD but not that license.
 
Except the x86 license that AMD has is NON-transferable. If Apple bought AMD, they would get AMD but not that license.

Interesting, I didn't know that. If that's the case, then AMD is a much less attractive buy, I think.
 
I don't think you have to go quite that low, but yeah, most coding done in a modern development tool above the low-level APIs should be pretty much the same.

There are some major differences. If nothing else, the ARM CPUs I've looked at didn't have an FPU and did it all in emulation. (Yes, in the modern GPU-computing world it doesn't matter, but I don't know of any app devs going, "yes, I want to write assembler code for my GPU too.")

If you're writing hello world, then no, not really.

There are a ton of CPU comparison articles on Ars Technica, but basically, I think of ARM as Intel 10 years ago.

Everybody always wants to count Intel out, but they have some of the best CPU engineering on the planet. Their fabs are 22nm with 3D gate tech, something the ARM fabs do not have. They have the best branch-predictor hardware on the planet, something ARM until recently didn't have either. (Granted, on the A1-A6, a three-stage CPU, a branch predictor is kind of pointless.)

It is going to get interesting in a lot of different ways in the next couple of years.
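On the FPU point specifically, you can see the soft-float/hard-float difference on GCC for ARM with nothing more than a compile flag. A sketch, assuming GCC's ARM predefines -- build once with -mfloat-abi=soft and once with -mfloat-abi=hard:

[code]
/* Build on ARM once with:  gcc -O2 -mfloat-abi=soft demo.c
 * and once with:           gcc -O2 -mfloat-abi=hard -mfpu=vfp demo.c
 * Soft-float turns every multiply below into a library call
 * (__aeabi_dmul in libgcc); hard-float uses one VFP instruction. */
#include <stdio.h>

int main(void)
{
    double x = 1.0;
    for (int i = 0; i < 1000000; i++)
        x *= 1.0000001;

#ifdef __SOFTFP__   /* GCC's marker for software floating point */
    puts("soft-float build: FP emulated with integer code");
#else
    puts("hard-float build (or non-ARM): FP runs on an FPU");
#endif
    printf("x = %f\n", x);
    return 0;
}
[/code]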
 
But the 64-bit stuff... wasn't that made by AMD first? And now Intel is using it?
 
There are a ton of CPU comparison articles on Ars Technica, but basically, I think of ARM as Intel 10 years ago.

They have the best branch-predictor hardware on the planet, something ARM until recently didn't have either.

It's not like they couldn't build it. Branch prediction papers have been published for a long time, through the '80s and '90s. It's all down to the power constraints they set on themselves. Same thing for their caching design, tuned mostly to reduce power usage rather than for lower latency/higher bandwidth.

The question isn't whether ARM can build a mobile/power-constrained CPU that competes with Intel, but whether an ARM can be built for workstations/servers. Especially with any company being able to license the ISA and work on the microarchitecture themselves.
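A quick illustration of why that branch-predictor hardware matters: the classic sorted-versus-unsorted experiment. Just a sketch -- build with -O1, since at higher optimization levels the compiler may replace the branch with a conditional move and hide the effect:

[code]
/* Toy demo of branch prediction: the same loop over the same values
 * runs much faster once the data is sorted, because the `>= 128`
 * branch becomes predictable. Exact speedup depends on the CPU. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static int cmp(const void *a, const void *b)
{
    return *(const unsigned char *)a - *(const unsigned char *)b;
}

static long sum_big(const unsigned char *v, size_t n, int passes)
{
    long sum = 0;
    for (int p = 0; p < passes; p++)
        for (size_t i = 0; i < n; i++)
            if (v[i] >= 128)      /* random data: ~50% mispredictions */
                sum += v[i];
    return sum;
}

int main(void)
{
    enum { N = 1 << 16 };
    static unsigned char v[N];
    for (size_t i = 0; i < N; i++)
        v[i] = rand() & 0xFF;

    clock_t t0 = clock();
    long s1 = sum_big(v, N, 1000);
    clock_t t1 = clock();

    qsort(v, N, 1, cmp);          /* sorted: branch is all-false, then all-true */

    clock_t t2 = clock();
    long s2 = sum_big(v, N, 1000);
    clock_t t3 = clock();

    printf("unsorted: %ld (%.2fs)  sorted: %ld (%.2fs)\n",
           s1, (double)(t1 - t0) / CLOCKS_PER_SEC,
           s2, (double)(t3 - t2) / CLOCKS_PER_SEC);
    return 0;
}
[/code]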
 
It's not like they couldn't build it. Branch prediction papers have been published for a long time, through the 80's and 90's. It is all due to the power constraints that they set on themselves. Same thing for their caching design, made mostly to reduce power usage instead of low latency/higher bandwidth.

The question isn't ARM building a mobile/power constraint CPU that would compete with Intel, but with an ARM built for workstations/servers. Especially with any company being able to purchase the ISA and work on the microarchitecture themselves.

Interesting, I didn't know that. If that's the case, then AMD is a much less attractive buy, I think.

It would be, to acquire talent (or at least what's left of it).
 
I dunno, they've made great strides with their A-series chips. They could scale those nicely and do well, I think.
 
There are some major differences. If nothing else, the ARM CPUs I've looked at didn't have an FPU and did it all in emulation. (Yes, in the modern GPU-computing world it doesn't matter, but I don't know of any app devs going, "yes, I want to write assembler code for my GPU too.")
I'm not sure what ancient ARM CPUs you've been looking at lately, but even the ARM11 design from 2003 used in the Raspberry Pi 1 has an integrated FPU, and all other ARM CPUs, perhaps save for some embedded controllers, have FPUs. ;)
Everything is using armhf, and armel has been all but completely phased out, including in Raspbian OS.


EDIT: ...and just saw that post was from 2012. :p
But still, even ARM CPUs back then had integrated FPUs, so my statement still stands, haha!
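Side note for anyone untangling armhf vs. armel: you can check which float ABI a build targets at compile time. A sketch assuming GCC-style predefines (__ARM_PCS_VFP for the hard-float calling convention, __SOFTFP__ for pure soft-float):

[code]
/* Compile-time check of the ARM float ABI a build targets. */
#include <stdio.h>

int main(void)
{
#if defined(__ARM_PCS_VFP)
    puts("armhf: floats passed in VFP registers");
#elif defined(__SOFTFP__)
    puts("soft-float: no FPU assumed at all");
#elif defined(__arm__)
    puts("softfp: FPU used, but soft-float calling convention");
#else
    puts("not a 32-bit ARM target");
#endif
    return 0;
}
[/code]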
 