Apple Plans to Use Its Own Chips in Macs from 2020, Replacing Intel

Discussion in 'HardForum Tech News' started by Megalith, Apr 8, 2018.

  1. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,931
    Joined:
    Jun 13, 2003
    Nice wall of text, but none of that addresses that Intel stopped producing purely CISC processors with the Pentium Pro over two decades ago, and AMD followed suit with the Athlon - and even before that they were tacking on various types of extra compute units.
     
    SvenBent likes this.
  2. ChadD

    ChadD [H]ardness Supreme

    Messages:
    4,207
    Joined:
    Feb 8, 2016
    At best the newest x86 designs can be considered hybrid designs. At their core they are still x86 instruction set chips. They still have caches and prediction engines, and perform calculations across multiple cycles, which is the very definition of a CISC design.

    ARM is, no doubt about it, a pure RISC design. The faster memory gets, the faster ARM will get >.<

    For those not up on the technical stuff... a RISC design stores its math in memory of some type, not transistors. So long, complicated math is slowed down not only by the generally lower clock rate of a RISC design but by the speed of the memory it's storing that math in. ARM chips have been catching up to x86 as memory speeds increase, and that isn't going to slow down.

    Like it or not, x86 is not an architecture that is well suited to the future of general computing.

    There are a lot of reasons why ARM will win in the end and end up running the vast majority of all computing devices, including high-end servers and even supercomputers.

    http://semiengineering.com/coherency-cache-and-configurability/
    Read this; it's in fairly plain English, and it should give you a good idea why ARM will no doubt win in the end. It's simply not possible for Intel to roll out hardware coherency in the same way... the day Intel cracks that in an x86 design and gives us a CPU that includes cores that are completely different from each other, I will agree that x86 has a future.
     
  3. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,931
    Joined:
    Jun 13, 2003
    ARM will win in the end!

    I about fell out of my chair.
     
    Armenius and ChadD like this.
  4. A Little Teapot

    A Little Teapot Limp Gawd

    Messages:
    146
    Joined:
    Dec 9, 2016
    "Sure, we've absolutely learned from our past mistakes. What we now know is that you need to have a significant market presence in order to force people to use proprietary technology."
     
  5. ChadD

    ChadD [H]ardness Supreme

    Messages:
    4,207
    Joined:
    Feb 8, 2016
    Glad you got a chuckle. Yes, if x86 is still around in 10-15 years I would be fairly shocked. As awesome as things like the Threadrippers of the world are, the market for that stuff is drying up.

    Sure, some people got on the "ARM is going to take over the server market" wagon way too early a few years back... but that is still going to happen. In the last 12 months multiple ARM supercomputer projects have gotten underway... including Cray's highest-end machines using ThunderX2. Some called it too soon, but the ARM takeover of that market is still coming. One big reason is what I have been talking about: coherency of design, and interconnect of co-processors on silicon, in a way that doesn't drive the costs of software development into orbit.

    There is a ton of ARM R&D happening right now in some very interesting areas that is going to start filtering into everyday products at some point.
    https://www.engadget.com/2018/03/27/nvidia-arm-ai-iot/
    https://www.engadget.com/2018/02/13/arm-machine-learning-processors/

    The market for high-end x86 chips is getting smaller... and yes, ARM is going to overtake x86 designs in performance sooner rather than later. If a company like Apple does start shipping Mac laptops and all-in-ones running ARM chips with specific ASIC units that let them chew through the types of software most people are using day to day, yes, it's going to eat into the x86 market even more.
     
    Red Falcon and Neo_East like this.
  6. Geef

    Geef Limp Gawd

    Messages:
    338
    Joined:
    Aug 5, 2009
    Apple isn't going to make its own chips. Some dude in China making like 2 dollars a day will be making them. Apple will be selling them, though.
     
  7. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,931
    Joined:
    Jun 13, 2003
    x86 was supposed to die 20 years ago. Since it isn't really x86 anymore, it can be morphed into whatever is needed.
     
  8. funkydmunky

    funkydmunky 2[H]4U

    Messages:
    2,395
    Joined:
    Aug 28, 2008
    Well, it explains why Apple has virtually abandoned HPC.
    They don't need high-powered computing anymore. Hipster flakes can edit photos and cut ARM-compatible Final Cut YouTube videos on future Apple products just fine. They are just abandoning markets that made them zero money. The desktop isn't dead, but it is to Apple.
    It is the right move for the company and will be financially rewarding.
     
    ChadD likes this.
  9. Uvaman2

    Uvaman2 2[H]4U

    Messages:
    3,143
    Joined:
    Jan 4, 2016
    Well, if their CPU is not powerful enough to replace what is in the Mac Pro, they can still do it in notebooks and small desktops. I've also been thinking about AMD video cards... will there come a day when they add a CPU to one and have a whole compute unit in there? I don't know, maybe that is how you make a Mac Pro... but that sounds too complex software-wise. Maybe they can do an ARM + Radeon SSG Mac Pro and still be very powerful anyway... Regardless, I still think this is about extracting things from Intel, which Intel may or may not care about; I think they are busy enough to maybe dump Apple.
     
  10. gamerk2

    gamerk2 [H]ard|Gawd

    Messages:
    1,641
    Joined:
    Jul 9, 2012
    The main problem with PPC was that IBM wasn't really able to clock the architecture up (due to PPC's design), and they ran out of easy ways to improve IPC. When Intel released Core 2, it was pretty much dead in the market. Prior to that, PPC chips were quite competitive with everything else out there; it was a solid architecture.
     
    Red Falcon and Brian_B like this.
  11. gamerk2

    gamerk2 [H]ard|Gawd

    Messages:
    1,641
    Joined:
    Jul 9, 2012
    Uhhh...no.

    Thread assignment is [typically] handled by the OS at runtime for both ARM and x86. Compilers have nothing to do with it [unless developers manually make assumptions about the underlying host CPU, which is typically frowned upon]. Even ARM's big.LITTLE is handled at runtime by the OS. All I have to do as a developer is spawn a thread and leave the rest up to the OS scheduler.
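
    To illustrate the point, here's a minimal sketch in C using POSIX threads (my own example, nothing ARM- or x86-specific): the code just creates a thread, and which core it lands on is entirely the kernel scheduler's decision.

        /* Spawn a worker and let the OS place it on whatever core it likes. */
        #include <pthread.h>
        #include <stdio.h>

        static void *worker(void *arg)
        {
            /* The kernel picks the core (big or little) and may migrate us. */
            printf("worker says: %s\n", (const char *)arg);
            return NULL;
        }

        int main(void)
        {
            pthread_t t;
            pthread_create(&t, NULL, worker, "scheduled by the OS, not the compiler");
            pthread_join(t, NULL);
            return 0;
        }

    Nothing in that source tells the scheduler where to run the thread; the same code behaves correctly on a big.LITTLE ARM part or a desktop x86 chip.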
     
  12. Mazzspeed

    Mazzspeed [H]ard|Gawd

    Messages:
    1,916
    Joined:
    Dec 27, 2017
    I honestly believe the PPC Macs were the last real Macs.

    Like the Motorola 68k, nothing lasts forever in the tech world - not even Windows will last forever.
     
    Red Falcon likes this.
  13. aaronspink

    aaronspink 2[H]4U

    Messages:
    2,122
    Joined:
    Jun 7, 2004
    That has basically nothing to do with ISA. The reality is that ISA largely doesn't matter. RISC vs. CISC is largely a pointless debate, devoid of any real substance or impact. ARM is basically confined to the markets it has always been confined to, and Intel processors likewise are largely confined to the markets they've always been confined to. The reality is that legacy software systems matter more than ISA. Always have, always will.

    As far as machine learning/AI goes, that is all moving to dedicated hardware and away from both CPUs and GPUs.

    You don't appear to have any fundamental understanding of computer architecture.
     
    Absalom and SvenBent like this.
  14. aaronspink

    aaronspink 2[H]4U

    Messages:
    2,122
    Joined:
    Jun 7, 2004
    No, the primary issue was that IBM/Motorola weren't making enough money off of the limited market available to sustain development of consumer level chips.
     
    Red Falcon, SvenBent and grtitan like this.
  15. aokman

    aokman Gawd

    Messages:
    801
    Joined:
    Jan 3, 2012
    Well duh... we have been predicting this for years. Apple's silicon is mighty competitive compared to Intel's, and the power consumption is the icing on the cake.
     
  16. chithanh

    chithanh Gawd

    Messages:
    781
    Joined:
    Oct 18, 2010
    One issue is x86 and the other is the PC platform, which are both built on a horrible pile of legacy stuff dating back to the early 1980s. It is absolutely holding implementors back, but Intel, being an x86 one-trick pony, will of course tell you otherwise.

    I mean, how long did it take until we got rid of the A20 gate?
    And Intel keeps piling cruft upon cruft. Just recently we got 5-level paging (each level is one more layer of indirection, walking from table to table until the virtual address finally resolves to a physical one).
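
    For a feel of what those five levels mean, here's a small sketch (my own illustration, following Intel's published 57-bit LA57 layout; the example address is arbitrary): a 57-bit virtual address splits into five 9-bit table indices plus a 12-bit page offset, one table lookup per level.

        /* Split a 57-bit virtual address into its 5-level paging fields. */
        #include <stdint.h>
        #include <stdio.h>

        int main(void)
        {
            uint64_t va = 0x1ABCDEF1234567ULL & ((1ULL << 57) - 1);

            unsigned offset = va & 0xFFF;          /* bits 0-11:  page offset     */
            unsigned pt     = (va >> 12) & 0x1FF;  /* bits 12-20: page table      */
            unsigned pd     = (va >> 21) & 0x1FF;  /* bits 21-29: page directory  */
            unsigned pdpt   = (va >> 30) & 0x1FF;  /* bits 30-38: PDPT            */
            unsigned pml4   = (va >> 39) & 0x1FF;  /* bits 39-47: PML4            */
            unsigned pml5   = (va >> 48) & 0x1FF;  /* bits 48-56: PML5 (the new level) */

            printf("PML5=%u PML4=%u PDPT=%u PD=%u PT=%u offset=0x%X\n",
                   pml5, pml4, pdpt, pd, pt, offset);
            return 0;
        }

    Five dependent table lookups just to find one physical page is exactly the kind of accumulating cruft I'm talking about.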

    Apple makes choices for their ARM platforms that get rid of legacy stuff. Current-generation iDevices no longer support 32-bit apps, which allows Apple to drop 32-bit support from their hardware entirely. I think we are at least a decade away from x86 processors that are 64-bit only.
     
  17. Phattio

    Phattio [H]Lite

    Messages:
    69
    Joined:
    May 20, 2006
    I dream of AmigaOS being ported to x86 (with support for the latest hardware). (And yes, I know about Amiga Forever/AmigaOS 4.1 Classic.)
     
    WhoMe likes this.
  18. Neo_East

    Neo_East n00b

    Messages:
    28
    Joined:
    Feb 10, 2018
    x86 is dead; the time of its inevitable technological obsolescence is here. ARM has been on an easily identifiable disruption trajectory against x86 since smartphones began (actually even earlier than that). The ubiquity of the ARM ISA will explode in the next couple of years with the incoming IoT explosion, which ARM will dominate the same way it dominated mobile.

    Not only should the Apple Mac ARM switch be talked about, but Microsoft's Windows and Azure ARM transition occurring in parallel should be recognized too. Just last week Microsoft expanded Windows on ARM to include 64-bit ARM app support for both UWP and Win32. This opens the floodgates to completely transitioning Windows off x86/Win32. Recall that Microsoft also announced a major reorganization of Windows the prior week that marks the beginning of the end of the old Win32/x86 paradigm of Windows.

    Not only is Microsoft backing the ARM ISA, it is getting in on the custom ARM chip game itself. It was announced last week that HoloLens 2 will switch from Intel to custom Microsoft ARM silicon with neural co-processors. You're going to see the same trend continue with Surfaces and even Xbox. The Surface Andromeda folding tablet concept may make its debut next month at Build, and it's an ARM device.

    Microsoft's commitment to ARM servers is real and high-level, not some experimental niche exercise. Microsoft's endgame is to transition Azure to its own custom ARM silicon, saving billions on Intel dependence and giving it a competitive advantage over AWS.

    https://www.geekwire.com/2018/micro...nds-everyone-company-house-silicon-expertise/
     
    Mazzspeed and ChadD like this.
  19. spugm1r3

    spugm1r3 [H]ard|Gawd

    Messages:
    1,153
    Joined:
    Sep 28, 2012
    It's funny, because I used to buy Apple, and this is always my lament. Even though the later PowerPC chips lagged, it always felt like switching to Intel was when Apple lost their way. No, I'm not ignorant of the fact that they also started making gobs of money around this same time. It just started to feel like your brilliant pothead best friend grew up to be a self-help guru.
     
    Mav451 likes this.
  20. Grimlaking

    Grimlaking 2[H]4U

    Messages:
    2,934
    Joined:
    May 9, 2006
    Look at the instruction sets on a CPU. See what they can do. Those are what we would call task-specific instruction sets: pathways that let specific types of tasks execute swiftly without having to use all of the silicon of a CPU. These are what you could call a RISC path. Today's Intel CPUs, and many of the others in your phones and other devices, follow a similar design process and actually license instruction sets from other manufacturers to ease development. (This is also the reason vulnerabilities like Meltdown/Spectre afflict so many different CPU types/generations.)

    A RISC processor is one that supports a reduced instruction set, meaning it doesn't have a ton of built-in general-use logic in the CPU itself. If it is going to do tasks that need general CPU work, those tasks must be coded in a way that runs in that sort of environment. The benefit of such a CPU is that you do not need a ton of task-specific silicon in your design (specifically targeted RISC units, if you will). You simply let developers code how they want their program to work, then compile it with a compiler and language that knows how to take advantage of the RISC unit to complete its needed compute.

    The problem is that you then run into a different roadblock. Instead of having a lot of specific instruction sets you can call on to speed up things like your PCIe calls, you must now rely on your compiler to properly handle that for a RISC-based CPU. Yes, this lets your program run on a large variety of CPUs, presuming they all use the same sort of RISC pipeline. But what we are calling an x86 CPU (inaccurate as that is), with in essence multiple RISC paths built into it, is going to execute those paths more efficiently. So the trade-off is: do you want the faster instructions for a smaller subset of (very popular) instruction set executions, or do you want the generalist that can handle anything at near the same speed, provided it was coded well and compiled with a cutting-edge compiler?

    Remember when knowing whether a game was written in C or C++ actually had an impact on how it would run? If we go RISC across the board, that will matter again.

    RISC is AWESOME for grid-based compute, because you don't rely on specific instruction sets and can write code that will execute in predictable timings across dozens, hundreds, or thousands of CPU units. You don't need to spend a ton of CPU cycles on assigning x to y for z. These are great things. For cutting-edge science and supercomputer needs, that is the kind of real performance you want/need. But for a more limited-use device (laptops, desktops and such, even many mid-to-large single servers) you want a good-performing CPU that can run at breakneck speed for each and every specialized compute request you throw at it.

    Do I think RISC units will replace 'x86'? No.

    My prediction is we will drop the term x86 as we move into 64-bit and 128-bit compute units in our devices. We will have very RISC-like instruction cycles, but it won't be a true RISC unit any longer because of the presence of so many other task-specific logic paths.

    But that's just my take... I could be wrong.
     
    Khahhblaab likes this.
  21. grtitan

    grtitan Telemetry is Spying on ME!

    Messages:
    1,266
    Joined:
    Mar 18, 2011
    Bingo!

    Due to a lack of native applications, the OSes that were ported to PPC didn't receive much attention from regular desktop users.

    Apple was rumored or expected to release their OS for other compatible PPC systems, but that never happened, so in the end the only company buying PPC in any quantity was Apple, and that wasn't enough to justify the development.

    A shame really; PPC was always faster than anything Intel had, until it started lagging in development.
     
  22. aaronspink

    aaronspink 2[H]4U

    Messages:
    2,122
    Joined:
    Jun 7, 2004
    No, this is loaded with BS. The reason Meltdown/Spectre affects so many CPUs is that it exploits holes in structures that are widely used in CPU design.

    Oh, BS. That's quite possibly the worst load of bollocks in this whole thread.

    CISC isn't really a thing, except to say "not RISC." RISC really isn't a thing either; more of an idea, if anything. There are very simple CISC designs and very complex RISC designs. The idea of CISC didn't exist until after RISC was coined. There is no well-defined breakpoint that makes something CISC or something RISC. You could reasonably make a RISC design that allows reg-mem ops.
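
    To make the reg-mem point concrete, here's a sketch (my illustration; the assembly in the comments is representative compiler output, not exact): the textbook split is that x86 lets an ALU op read memory directly, while a classic load/store RISC has to issue an explicit load first.

        /* One C statement, two classic encodings of it. */
        int add_from_mem(int acc, const int *p)
        {
            return acc + *p;
            /* x86-64 (reg-mem):     add  edi, DWORD PTR [rsi]   ; ALU op reads memory */
            /* AArch64 (load/store): ldr  w1, [x1]               ; explicit load first */
            /*                       add  w0, w0, w1             ; then reg-reg add    */
        }

    Nothing stops a "RISC" from adding that memory-operand form, which is why the label doesn't draw a clean line.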
     
    SvenBent likes this.
  23. Grimlaking

    Grimlaking 2[H]4U

    Messages:
    2,934
    Joined:
    May 9, 2006
    Perhaps you should read up on the topic.

    https://en.wikipedia.org/wiki/Reduced_instruction_set_computer

    I hope that helps!
     
  24. fuzzylogik

    fuzzylogik Gawd

    Messages:
    631
    Joined:
    Aug 30, 2012
    Heh! This will happen once Apple feels like they can make a lot of money off it. I mean, what better way to do planned obsolescence than releasing the new version of OS X (or OS XI) where folks can't just upgrade - they MUST purchase new hardware because of a new custom chip? And then after a while, they can switch to a different type of CPU and force this all over again.
     
  25. aaronspink

    aaronspink 2[H]4U

    Messages:
    2,122
    Joined:
    Jun 7, 2004
    I have designed microprocessors for a living. The Wikipedia article, as simple as it is, doesn't agree with you either. I fully stand by my previous comment, as would pretty much anyone familiar with the history of computer ISAs.
     
    SvenBent likes this.
  26. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    28,587
    Joined:
    Oct 29, 2000
    Poor Mac users are going to be force-fed yet another architecture switch.

    Other than that though, I'm curious to see what Apple brings here.

    I'm no Apple fan. I loathe their designs, their operating systems, and their walled-garden approach to software, but as unlikely as this would have sounded a few years back when they switched from Samsung's design to their in-house A4 series chips around 2010, Apple makes the best, highest-performing mobile chips, hands down. Nothing from the competition comes even close.

    If I could get an iPhone with an unlocked bootloader and available drivers so you could roll your own Android ROM, I'd do it in a second. The hardware is that good. Nothing comes close to touching the A11 chip they are selling now.
     
  27. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    28,587
    Joined:
    Oct 29, 2000

    Speak for yourself.

    I just took apart one of the instruments my company makes, pulled out the main board, and looked at the chip. A Motorola 68030 SoC. It made me smile.

    This shit lives on forever in embedded applications :p
     
  28. Mazzspeed

    Mazzspeed [H]ard|Gawd

    Messages:
    1,916
    Joined:
    Dec 27, 2017
    Don't worry, I've got a device running a 68030 here myself. Nothing like the good ol' days!

    In fact, the 68000 series is still used in a number of automotive applications; chances are your car has one in its PCM.
     
    Zarathustra[H] likes this.
  29. Mazzspeed

    Mazzspeed [H]ard|Gawd

    Messages:
    1,916
    Joined:
    Dec 27, 2017
    And this is a valid point: there's ARM, and then there's Apple's ARM-based SoC. Apple's custom designs are far faster than the competition's.
     
  30. nutzo

    nutzo [H]ardness Supreme

    Messages:
    7,380
    Joined:
    Feb 15, 2004
    x86 gone in 10-15 years? I really doubt it.
    Many small companies will still be running the systems they are buying today :p
    Office workers will still be running Windows 10 and Office 2016 on an i5 CPU.

    More than likely the higher-end users will be switching to the 256-bit version of Windows 15 and complaining that our old 32-bit apps no longer work.
     
    ChadD and Khahhblaab like this.
  31. Khahhblaab

    Khahhblaab Limp Gawd

    Messages:
    481
    Joined:
    Apr 23, 2017
    ...and don't forget that, at the very least, the processors in the tool sets making the chips will be x86.
     
  32. gamerk2

    gamerk2 [H]ard|Gawd

    Messages:
    1,641
    Joined:
    Jul 9, 2012
    Chicken/egg.

    But yeah, having the Macintosh as your only consumer-level customer (legacy PPC still rules the embedded market; even ARM hasn't really broken in yet) certainly hurts. Even then the arch was attractive until it topped out, and the Mac went x86 as a result.

    Such a shame too; like 68k before it, PPC is a MUCH cleaner arch than x86 is.

    68k-based chips are still used in the embedded market as cheap microcontrollers, where their price/performance/power ratio makes them attractive.

    Then again, this is the market where you still see Z80s and 286s in everyday use.
     
  33. aaronspink

    aaronspink 2[H]4U

    Messages:
    2,122
    Joined:
    Jun 7, 2004
    Eh, what? ARM has literally owned the embedded markets since its first release. That is what it retreated to when it bombed out of the PC market with its first product. The only place in the embedded markets where PPC has ever had a toehold is at the very highest end. All the volume of the embedded market has been ARM for decades. Hell, more ARM cores ship in disk drives every year than PPC has sold in its entire history.

    Architectural cleanness basically means F-all. Always has, always will.
     
  34. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    24,645
    Joined:
    Feb 1, 2005
    So I guess Apple really does want to try to convince me to spend $1000+ on a fancy Chromebook.
     
  35. Mazzspeed

    Mazzspeed [H]ard|Gawd

    Messages:
    1,916
    Joined:
    Dec 27, 2017
    Nothing wrong with a Chromebook; they're undoubtedly growing in popularity. The Google Pixelbook is upwards of $1,000.
     
  36. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    24,645
    Joined:
    Feb 1, 2005
    The Pixelbook has an i5 processor and 8GB of RAM.

    My point is that I can buy a cheap ARM-based Chromebook for $100-150 rather than a $1500 ARM-based MacBook that's good for surfing the internet.
     
  37. Mazzspeed

    Mazzspeed [H]ard|Gawd

    Messages:
    1,916
    Joined:
    Dec 27, 2017
    You have to consider that if Apple goes down this path, the processor is going to be heavily modified and most likely quite a capable device. Besides, everything's going to the cloud; I wouldn't be surprised to see Microsoft push their OS toward the cloud, since they've announced that Windows is a low priority in favor of their cloud division.

    Times are changing, and there was a time when no one thought the Motorola 68k would be toppled from its perch in personal computing.
     
  38. hmz

    hmz 2[H]4U

    Messages:
    3,038
    Joined:
    Jul 26, 2005
    I doubt it will happen, but Moore's Law doesn't apply to Apple's chips, or what?
     
  39. Mazzspeed

    Mazzspeed [H]ard|Gawd

    Messages:
    1,916
    Joined:
    Dec 27, 2017
    It's happened time and time again in the past; there's absolutely nothing to say that history doesn't apply to x86-64.
     
  40. ChadD

    ChadD [H]ardness Supreme

    Messages:
    4,207
    Joined:
    Feb 8, 2016
    I think you mistake ARM for low-end chips. ARM is just an architecture. The Apple A11 is 5-10x faster than the chips you find in the cheapo $100 Chromebooks. The A11 in the current iPhones is actually faster than midrange 4-core i5s. If Apple does switch their laptop/desktop machines, they won't be using A11s; they will be using an A12X or A13 with custom ASICs designed for those machines. Yes, Intel should be very worried, even if Apple doesn't switch to their own custom ARM chips in the next few years. ARM is without a doubt starting to gain serious traction in servers and even supercomputing.
     
    Brian_B likes this.