Apple to Announce its own Mac Processor

Steve Jobs returning saved them from bankruptcy and gave them a second wind in the early 2000s, but that era is long gone. When he passed in 2011, Apple again started a downhill slide into what it is today, one of the most anti-consumer companies in the world. They're literally leading the fight against consumers' rights to ownership and repair of their devices and treat their own customers like yesterday's garbage. They're again stuck in a position where they can't innovate and are doing everything they can to lock users into a walled garden so they can't escape, and lock 3rd party vendors out so they hold all of the cards.

I wish folks would stop parroting "Steve Jobs was divine perfection" myths.

Many of the trends you're complaining about now started during his golden age. Who do you think put so much trust in Jony Ive and was happy to go with ever slimmer designs that had reduced ports and expandability? Remember, the original MacBook Air came out in 2008. Conversely, I'd say that Apple has had a few breakthroughs since 2011. The Apple Watch was the first smartwatch to really resonate with the public. AirPods and the wireless chips that have driven them helped Bluetooth audio take off. Even the 5K iMac is a notable innovation, since until then greater-than-4K displays were both unattainable for non-pros and usually required kludges (like grafting two display signals together) to work.

It's also rather amusing to see complaints that Apple is 'back' to the walled garden when its software is the most open it's ever been. You can get Apple Music and Apple TV+ on a whole host of devices ranging from Android hardware to smart speakers. Smart TVs not only support those apps, but platforms like AirPlay 2 and HomeKit. You can even use third-party routers for secure HomeKit features, and HomePods support Spotify streaming. Yes, devices like the Apple Watch and HomePod are clearly meant to reward those who invest fully into the Apple ecosystem, but I find it baffling that you'd claim that Jobs' era was somehow more open.
 
I wish folks would stop parroting "Steve Jobs was divine perfection" myths.

I have no idea why you think I even remotely said anything like that. I in fact despise Steve Jobs; he was a supreme asshole to his employees during the development of the original Macintosh, something his coworkers extensively documented. He was also a really sleazy businessman, willing to throw anyone under the bus for his goals, regardless of the personal cost to the people he screwed over.

The fact is that Jobs saved Apple from bankruptcy. He was responsible for cutting all of the unprofitable aspects of the business and focusing it on something that would make the business profitable, like the iMac G3, which was wildly successful.

Many of the trends you're complaining about now started during his golden age. Who do you think put so much trust in Jony Ive and was happy to go with ever slimmer designs that had reduced ports and expandability? Remember, the original MacBook Air came out in 2008.

And they also had the MacBook Pro in 2008. The MacBook Air was designed to be thin and light, so things had to be compromised. The MacBook Pro was a fully featured laptop and had plenty of expandability. The trend of integration on their laptops did continue after his death, but you can't blame that on a guy who's been dead for 9 years. It's not like they exhume his corpse every year to get a blessing on some new product they want to release.

It's like saying Asus is entirely crap because the Eee PC netbook doesn't have any expandability, when they also make gaming laptops in which everything can be swapped out.

Conversely, I'd say that Apple has had a few breakthroughs since 2011. The Apple Watch was the first smartwatch to really resonate with the public. AirPods and the wireless chips that have driven them helped Bluetooth audio take off. Even the 5K iMac is a notable innovation, since until then greater-than-4K displays were both unattainable for non-pros and usually required kludges (like grafting two display signals together) to work.

The Apple Watch is a piece of braindead junk that requires a several hundred dollar phone to fully work, unless you're willing to pay hundreds of dollars more for a special version which can function better without the phone. And only Apple could get iZombies to pay $100+ for wireless earbuds. Bluetooth has been around since the early 2000s and was doing just fine before AirPods came along. I'll give them the 5K screen, but only because of the screen. The rest of the machine is junk because it's literally glued shut, preventing any user maintenance or CPU upgrades without risking destroying the screen to do so.

but I find it baffling that you'd claim that Jobs' era was somehow more open.

Jobs' era was overall far more open in terms of upgradeability on their desktop and some laptop machines, whether he intended it or not. Up until the original cheese grater Mac Pro, you could use mostly industry standard parts for upgrades. It wasn't until later that Apple started pulling in the reins so they held all of the cards.

And just because you can use your AirPods with an Android phone does not at all make Apple open. Open in my definition is not obfuscating their designs, not using proprietary parts only they can get, and not spending millions of dollars trying to stifle user rights. They literally have lobbying groups around the country doing whatever possible to block consumer right to repair and consumer rights in general. They don't want a second hand market for their products, and they don't want anyone else selling their products but themselves. This behavior didn't start until after Jobs died, so you can't say he had anything to do with it.

When you buy Apple, you support basically the most anti-consumer corporation in the world.
 
Apple is migrating all its Macs to ARM, not to AMD Zen2/3 whatever.

Amazon is migrating all its internal workloads to ARM, not to AMD Rome.

#1 on the Green500 list is an ARM HPC build, not some AMD Rome + GPU build.
Yessss, and soon, x86-64 will be gone forever, mwahahaha!!! :LOL:
 
I have no idea why you think I even remotely said anything like that. I in fact despise Steve Jobs; he was a supreme asshole to his employees during the development of the original Macintosh, something his coworkers extensively documented. He was also a really sleazy businessman, willing to throw anyone under the bus for his goals, regardless of the personal cost to the people he screwed over.

The fact is that Jobs saved Apple from bankruptcy. He was responsible for cutting all of the unprofitable aspects of the business and focusing it on something that would make the business profitable, like the iMac G3, which was wildly successful.

And they also had the MacBook Pro in 2008. The MacBook Air was designed to be thin and light, so things had to be compromised. The MacBook Pro was a fully featured laptop and had plenty of expandability. The trend of integration on their laptops did continue after his death, but you can't blame that on a guy who's been dead for 9 years. It's not like they exhume his corpse every year to get a blessing on some new product they want to release.

It's like saying Asus is entirely crap because the Eee PC netbook doesn't have any expandability, when they also make gaming laptops in which everything can be swapped out.


The Apple Watch is a piece of braindead junk that requires a several hundred dollar phone to fully work, unless you're willing to pay hundreds of dollars more for a special version which can function better without the phone. And only Apple could get iZombies to pay $100+ for wireless earbuds. Bluetooth has been around since the early 2000s and was doing just fine before AirPods came along. I'll give them the 5K screen, but only because of the screen. The rest of the machine is junk because it's literally glued shut, preventing any user maintenance or CPU upgrades without risking destroying the screen to do so.

Jobs' era was overall far more open in terms of upgradeability on their desktop and some laptop machines, whether he intended it or not. Up until the original cheese grater Mac Pro, you could use mostly industry standard parts for upgrades. It wasn't until later that Apple started pulling in the reins so they held all of the cards.

And just because you can use your AirPods with an Android phone does not at all make Apple open. Open in my definition is not obfuscating their designs, not using proprietary parts only they can get, and not spending millions of dollars trying to stifle user rights. They literally have lobbying groups around the country doing whatever possible to block consumer right to repair and consumer rights in general. They don't want a second hand market for their products, and they don't want anyone else selling their products but themselves. This behavior didn't start until after Jobs died, so you can't say he had anything to do with it.

When you buy Apple, you support basically the most anti-consumer corporation in the world.

Ah, I see the issue: you're one of those types who lives on exaggerated stereotypes of Apple, including products you generally haven't used and know little about. Jobs was problematic, to put it mildly, but the truth is that the real Jobs is somewhere between the hagiographic and demonized versions. He was a pain in the ass and made serious product mistakes (G4 Cube and iPod Hi-Fi, anyone?), but he also had a knack for predicting where the industry was going and thinking about more than raw specs.

User repairs are nice and sometimes important, but they're not the be-all, end-all of a product's worth... you seem to be obsessed with that. If you really think the Apple Watch is "braindead junk..." you've definitely never used a smartwatch outside of a store demo. Just in my own experience it's been useful for fitness, payments, calls, music, the smart home... And AirPods? If you really think they're just another set of Bluetooth earbuds, you've kinda missed the point. The audio quality is mediocre, but they made the technology easy and reliable enough that people who'd never considered wireless audio could easily get them set up and running.
 
Ah, I see the issue: you're one of those types who lives on exaggerated stereotypes of Apple, including products you generally haven't used and know little about.

Great ad hominem there. You'd make an excellent politician.

User repairs are nice and sometimes important, but they're not the be-all, end-all of a product's worth... you seem to be obsessed with that.

Everyone should be "obsessed" with consumers' rights, but you're free to support a terrible corporate entity that only cares about you as long as you have money in your hand. They're also the be-all and end-all where Apple is concerned, because Apple can't seem to release a product without serious design faults. Pretty much every MacBook ever released has had at least one major design fault, and Apple had to be forced via the courts to fix many of them because they downplay the significance of every single one. Nvidia GPU faults? "A small number of users may have experienced problems." Butterfly keyboards? "A small number of users may have experienced problems." Hell, just type the search term into Google to find years of it.

If you really think the Apple Watch is "braindead junk..." you've definitely never used a smartwatch outside of a store demo. Just in my own experience it's been useful for fitness, payments, calls, music, the smart home... And AirPods? If you really think they're just another set of Bluetooth earbuds, you've kinda missed the point. The audio quality is mediocre, but they made the technology easy and reliable enough that people who'd never considered wireless audio could easily get them set up and running.

And you completely missed the point where the watch requires being tethered to an iPhone for the majority of those features to work, or a Wi-Fi hotspot to add a few more, and those aren't everywhere; hence braindead. Also, by your own admission, the AirPods aren't that great. Why pay over $100 for them when you can buy generic wireless earbuds that sound the same for 1/5 the price? iZombies. Bluetooth is not that complicated, and Apple does nothing to make pairing devices easier. You can't make the process of pairing earbuds much simpler than turning them on and pairing them with your phone.

Apple is like owning a Bentley or a Maserati: it's a status symbol. Their products haven't been revolutionary for a very long time, and they sure aren't reliable.
 
<Yawn>...wake me up when MacOS has enough market share for what they put in their machines to actually matter.

I'd move full time to Linux before I moved full time to MacOS.

Jesus, give it a rest. Apple holds the majority market share in audio production, photography, etc. Just because that isn't you doesn't mean those users don't exist or aren't important to anyone who isn't a Windows drone.

As for everyone else, this thread is to discuss Apple's move to ARM, not what anyone thinks about Apple, because personally we don't give a shit what you think of them; you aren't that important.
 
Even then, Apple will probably hard-code some key in their CPUs so that only their CPU can boot macOS. Someone will eventually crack the key, but until then you would have to get a salvage CPU out of someone's old tower.

"ourhardworkbythese......" :p

As long as I can still find something that runs Snow Leopard, Apple can migrate to flip phone CPUs and the TempleOS kernel for all I care.
 
As long as I can still find something that runs Snow Leopard, Apple can migrate to flip phone CPUs and the TempleOS kernel for all I care.

So you run an OS that hasn't had a security fix in 6 years? Is your other OS Windows XP?
 
Great ad hominem there. You'd make an excellent politician.

Well, given that you are relying on stereotypes and an obvious lack of experience, that's not an ad hominem... that's just a reasonable observation based on evidence. Unless you've actually been using Macs for several years or more and just don't want to admit it.

Everyone should be "obsessed" with consumers' rights, but you're free to support a terrible corporate entity that only cares about you as long as you have money in your hand. They're also the be-all and end-all where Apple is concerned, because Apple can't seem to release a product without serious design faults. Pretty much every MacBook ever released has had at least one major design fault, and Apple had to be forced via the courts to fix many of them because they downplay the significance of every single one. Nvidia GPU faults? "A small number of users may have experienced problems." Butterfly keyboards? "A small number of users may have experienced problems." Hell, just type the search term into Google to find years of it.

Pssst, here's a secret: virtually every tech company only cares about you as long as there's money in your hand. What, do you think the company that makes your Android phone or Windows laptop is acting out of the goodness of their heart? They'd screw you over for peanuts. If they decide to let you upgrade RAM or storage yourself, it's not because they deeply care about you, it's because they decided that the math worked out in favor of that for them.

Also, to be blunt, it's clear that you neither pay attention to the Mac scene beyond superficial headlines nor notice what's happening on the Windows side. If you only look at headlines, as is evident here, you'd be convinced that every MacBook was a disaster. But the boring reality is that most issues only ever affect a small portion of systems. Yeah, the butterfly keyboard sucked, but issues like display coatings or hinges? Most people wouldn't even realize there was a problem.

Conversely, there are many Windows laptops that have problems. Dell XPS systems are having problems right now. Microsoft's Surface Laptop 3 had a problem with the display glass cracking without warning. And I can point to many anecdotal experiences (both for myself and friends) of Windows PCs having seemingly systemic problems that make Macs look like bastions of reliability. The differences come down to expectations and the nature of the market. You expect lower-priced PCs, with poorer build quality and part choices, to fail more often. And it's harder to pinpoint larger quality issues at a Windows PC maker when its systems are both less prominent in the market and split across a much wider model variety. Dell's G5 gaming laptop has had reliability issues, for example (one friend had to get hers repaired and replaced multiple times), but you don't hear much about that because it's one of dozens of Dell models and it's cheap.

And you completely missed the point where the watch requires being tethered to an iPhone for the majority of those features to work, or a Wi-Fi hotspot to add a few more, and those aren't everywhere; hence braindead. Also, by your own admission, the AirPods aren't that great. Why pay over $100 for them when you can buy generic wireless earbuds that sound the same for 1/5 the price? iZombies. Bluetooth is not that complicated, and Apple does nothing to make pairing devices easier. You can't make the process of pairing earbuds much simpler than turning them on and pairing them with your phone.

Apple is like owning a Bentley or a Maserati: it's a status symbol. Their products haven't been revolutionary for a very long time, and they sure aren't reliable.

No, I didn't miss those points at all. Calling the Apple Watch "braindead" ignores the value it has even with that necessity of pairing with an iPhone. It's like complaining that a towed trailer is junk because it can't drive itself. No shit, it needs a parent device to work -- but that doesn't change that it adds a lot of value for some people. And on AirPods? Again, you really don't get it. You can get wireless earbuds that sound as good for less, but they won't be as easy to set up and use, won't have hands-free voice control, likely won't have as stable a connection and probably won't have a battery case that offers a whole day's worth of playback. And yes, they are easier to pair and use. It's literally down to "open the AirPods case and tap connect." It's not horribly difficult to pair normal buds, but there's a good reason why companies like Google and Samsung have started copying Apple's approach -- it's just faster and lets even a total neophyte get started without fuss.

Apple is a status symbol, to be sure, but those of us who actually use its products know that it's about more than the logo. It's not perfect and I won't pretend otherwise, but there are real, valid reasons why it's successful. I just wish you'd try its products instead of willfully remaining blind.
 
Well, given that you are relying on stereotypes and an obvious lack of experience, that's not an ad hominem... that's just a reasonable observation based on evidence. Unless you've actually been using Macs for several years or more and just don't want to admit it.

Choosing to ignore publicly available information does not make your statements any less ad hominem. You don't know me, and you certainly don't know my experiences; you have no grounds for saying what I do or do not know.

Pssst, here's a secret: virtually every tech company only cares about you as long as there's money in your hand. What, do you think the company that makes your Android phone or Windows laptop is acting out of the goodness of their heart? They'd screw you over for peanuts. If they decide to let you upgrade RAM or storage yourself, it's not because they deeply care about you, it's because they decided that the math worked out in favor of that for them.

What a dystopian view of the world. You've obviously been using Apple devices far too long if you expect every tech company to mimic their abhorrent behavior. There are quite a few companies which care about the user beyond the initial sale, and not because Uncle Sam forces them to do so.

Also, to be blunt, it's clear that you neither pay attention to the Mac scene beyond superficial headlines nor notice what's happening on the Windows side. If you only look at headlines, as is evident here, you'd be convinced that every MacBook was a disaster. But the boring reality is that most issues only ever affect a small portion of systems. Yeah, the butterfly keyboard sucked, but issues like display coatings or hinges? Most people wouldn't even realize there was a problem.

More ad hominem nonsense. If the boring reality were, as Apple claims and you seem to agree, that only a small number of systems experience issues, there wouldn't be unauthorized repair centers across the country perpetually flooded with Apple products. There also wouldn't be class action lawsuits brought against Apple almost yearly for known design faults that they've downplayed and covered up. Louis Rossmann out of NY has excellent examples of Apple design faults going back at least a decade.

Conversely, there are many Windows laptops that have problems. Dell XPS systems are having problems right now. Microsoft's Surface Laptop 3 had a problem with the display glass cracking without warning. And I can point to many anecdotal experiences (both for myself and friends) of Windows PCs having seemingly systemic problems that make Macs look like bastions of reliability. The differences come down to expectations and the nature of the market. You expect lower-priced PCs, with poorer build quality and part choices, to fail more often. And it's harder to pinpoint larger quality issues at a Windows PC maker when its systems are both less prominent in the market and split across a much wider model variety. Dell's G5 gaming laptop has had reliability issues, for example (one friend had to get hers repaired and replaced multiple times), but you don't hear much about that because it's one of dozens of Dell models and it's cheap.

Nobody said that Windows machines are fault free, but there is a big difference between Dell and Apple. Dell subcontracts their designs out to 3rd parties like Pegatron, MSI, Gigabyte, etc. Apple has their own in-house engineers and controls everything from design to manufacturing to retail. Foxconn builds Apple's products in factories built to Apple's specifications, under corporate micromanagement directly from Apple. Dell, HP, etc. do not have that kind of control over their product lines. Apple also tends to base their newer designs on tech from previous designs, which normally means that problems from previous designs are rectified in subsequent ones. Dell and other Windows computer retailers experience more issues because they subcontract their machine specs out to third parties.

When you pay $1000+ for a MacBook, there is the expectation that it will last for a very long time and that you'll be treated right if you run into problems. The same is not so for a $300 Dell laptop; people buy them because they're cheap and don't have the same level of expectations as if they were buying a premium product. Apple has no excuse for glaring problems with their products given the amount of control they have over their supply chain. I do electronic device repair for a living, and I see far more Apple products with stupid problems unrelated to user faults than I do Windows machines.

No, I didn't miss those points at all. Calling the Apple Watch "braindead" ignores the value it has even with that necessity of pairing with an iPhone. It's like complaining that a towed trailer is junk because it can't drive itself. No shit, it needs a parent device to work -- but that doesn't change that it adds a lot of value for some people.

Your trailer analogy doesn't really work. A trailer can do hundreds or thousands of things without being attached to a pulling device; an Apple Watch can't do many things without an iPhone. But beauty is in the eye of the beholder.

And on AirPods? Again, you really don't get it. You can get wireless earbuds that sound as good for less, but they won't be as easy to set up and use, won't have hands-free voice control, likely won't have as stable a connection and probably won't have a battery case that offers a whole day's worth of playback. And yes, they are easier to pair and use. It's literally down to "open the AirPods case and tap connect." It's not horribly difficult to pair normal buds, but there's a good reason why companies like Google and Samsung have started copying Apple's approach -- it's just faster and lets even a total neophyte get started without fuss.

That sounds a lot like TOZO earbuds https://www.amazon.com/dp/B07J2Z5DBM which have very similar specs for less than half the cost. Apple is not the be-all and end-all of making easy to use products.

I just wish you'd try its products instead of willfully remaining blind.

It'd be great if you stopped attacking my character because I have nothing nice to say about a brand you have loyalty to, but it doesn't look like that will ever happen.
 
My wallet will for sure be a little fatter because I'll never buy an Apple product when I can get a comparable device for half or less...
 
My wallet will for sure be a little fatter because I'll never buy an Apple product when I can get a comparable device for half or less...

Let us know when you find an Intel processor that uses 1/4 the power of current ones... ARM and Intel are not even remotely comparable.
 
If you can make one reliable prediction about Apple's behavior on hardware built in-house, assume they will err toward control freakery. I'll be interested to see when someone gets Linux running on an ARM Mac without resorting to animal sacrifice.
I'll take it with animal sacrifice, never liked cats much anyways. I'll call it a win-win.
 
Everyone is missing the forest for the trees.

An ARM processor doesn't need to outperform X86 for tasks like 8K video rendering; Apple has devoted significant resources to FPGA development and is testing the water for FPGA accelerators in the Mac Pro. We use FPGAs at work for signal processing, and a $50 FPGA is faster than a 32-core Threadripper.

I would not be at all surprised to see power efficient and fast ARM processors as the main CPU, with FPGAs and other specific types of coprocessors to accelerate specific tasks. It makes a lot of sense.
 
Everyone is missing the forest for the trees.

An ARM processor doesn't need to outperform X86 for tasks like 8K video rendering; Apple has devoted significant resources to FPGA development and is testing the water for FPGA accelerators in the Mac Pro. We use FPGAs at work for signal processing, and a $50 FPGA is faster than a 32-core Threadripper.

I would not be at all surprised to see power efficient and fast ARM processors as the main CPU, with FPGAs and other specific types of coprocessors to accelerate specific tasks. It makes a lot of sense.

How many different FPGAs and coprocessors would they need?

The beauty of a general purpose CPU is that it can do everything and you don't need a specialized piece of silicon for every little task.
 
How many different FPGAs and coprocessors would they need?

The beauty of a general purpose CPU is that it can do everything and you don't need a specialized piece of silicon for every little task.

Oh I forgot. Apple's entire business model is based around selling over-priced unupgradeable junk that has to be constantly replaced.

A new video codec/resolution/bit depth becomes popular requiring a new rendering coprocessor? Oh well, guess you have to buy another $3,000 computer.

In retrospect it sounds right up Apple's alley.
 
How many different FPGAs and coprocessors would they need?

The beauty of a general purpose CPU is that it can do everything and you don't need a specialized piece of silicon for every little task.

Not really; x86 is a jack of all trades, master of none. Just because it can do something doesn't mean it is even remotely efficient relative to power consumption, thermal output, etc.

Hence why the Metal API is so important: it allows processing tasks to be carried out by the T2, Afterburner, Intel chips, Apple chips, Intel GPUs and AMD GPUs under a single framework. No OpenCL, Vulkan, CUDA, bla bla bla.
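To make the "single framework" point concrete, here's a minimal sketch of a device-agnostic Metal compute dispatch as seen from the app side. The kernel name add_arrays and the buffer sizes are placeholders for illustration; the point is that the same calls run on whatever GPU the system reports, without the app caring which.

Code:
import Metal

// Grab whichever GPU the system exposes -- Intel, AMD or Apple silicon.
// The calling code is identical either way; that's the point of Metal.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),             // compiled .metal shaders in the app bundle
      let kernel = library.makeFunction(name: "add_arrays")  // hypothetical compute kernel
else { fatalError("No Metal device available") }

let pipeline = try! device.makeComputePipelineState(function: kernel)

let count = 1024
let bytes = count * MemoryLayout<Float>.stride
guard let inA = device.makeBuffer(length: bytes, options: .storageModeShared),
      let inB = device.makeBuffer(length: bytes, options: .storageModeShared),
      let out = device.makeBuffer(length: bytes, options: .storageModeShared),
      let cmd = queue.makeCommandBuffer(),
      let enc = cmd.makeComputeCommandEncoder()
else { fatalError("Failed to create Metal resources") }

enc.setComputePipelineState(pipeline)
enc.setBuffer(inA, offset: 0, index: 0)
enc.setBuffer(inB, offset: 0, index: 1)
enc.setBuffer(out, offset: 0, index: 2)

// Let Metal split the work across however many execution units the device has.
let threadsPerGroup = MTLSize(width: min(pipeline.maxTotalThreadsPerThreadgroup, count), height: 1, depth: 1)
enc.dispatchThreads(MTLSize(width: count, height: 1, depth: 1),
                    threadsPerThreadgroup: threadsPerGroup)
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

Whether that dispatch lands on an Intel iGPU, an AMD dGPU or Apple's own silicon is Metal's problem, not the app's, which is exactly what lets Apple swap the hardware underneath.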

Oh I forgot. Apple's entire business model is based around selling over-priced unupgradeable junk that has to be constantly replaced.

A new video codec/resolution/bit depth becomes popular requiring a new rendering coprocessor? Oh well, guess you have to buy another $3,000 computer.

In retrospect it sounds right up Apple's alley.

As opposed to dropping $3k on a stupid Threadripper CPU because devs are too lazy to rewrite their code for more optimised processing.
 
Oh I forgot. Apple's entire business model is based around selling over-priced unupgradeable junk that has to be constantly replaced.

A new video codec/resolution/bit depth becomes popular requiring a new rendering coprocessor? Oh well, guess you have to buy another $3,000 computer.

In retrospect it sounds right up Apple's alley.

That's not how FPGAs work, FPGAs are incredibly versatile. In fact, the entire point of them is completely reconfigurable hardware at the GATE level. They are massively parallel architectures and are far superior to X86 at anything related to signal processing, transcoding, etc. The same FPGA could be flashed to transcode video or generate rasters from FFTs. No hardware changes required. Think of them like a flexible ASIC.

FPGAs generally trade a little bit of performance vs. a full on dedicated ASIC for the flexibility to be reprogrammed on the fly for any task. They are not suitable for every task, which is why it's nice to have a main CPU for certain things. But for the tasks they are suited (high bandwidth massively parallel applications) they are insanely powerful.
 
That's not how FPGAs work, FPGAs are incredibly versatile. In fact, the entire point of them is completely reconfigurable hardware at the GATE level. They are massively parallel architectures and are far superior to X86 at anything related to signal processing, transcoding, etc. The same FPGA could be flashed to transcode video or generate rasters from FFTs. No hardware changes required. Think of them like a flexible ASIC.
If they are massively programmable, how come I need Turing for the latest and greatest NVENC encoding? GPUs for transcoding videos are only JUST starting to get good, and only with Turing. CPUs rule the roost as far as quality to file size ratio, AND it doesn't matter what CPU you've got either. All recent CPUs can transcode x264 and x265 and get the same quality of results (just not speed).
 
If they are massively programmable, how come I need Turing for the latest and greatest NVENC encoding? GPUs for transcoding videos are only JUST starting to get good, and only with Turing. CPUs rule the roost as far as quality to file size ratio, AND it doesn't matter what CPU you've got either. All recent CPUs can transcode x264 and x265 and get the same quality of results (just not speed).

Pretty much the only reason FPGAs are not more widespread is because it is very difficult to find people who are truly skilled at programming them. Verilog and VHDL are basically black arts at this point outside of the defense world and signal processing.

However, the upside to it is that if you do find someone, you're set. I have a headphone DAC/amp that runs on a Xilinx Spartan 7 that a competent engineer could program to be faster than my 9900K for transcoding.
 
Here's something that may help you understand the scope of how flexible FPGAs are:

Let's say someone is arguing about an ARM processor vs. a FPGA for a specific task. I could take a FPGA and literally build an ARM processor inside the FPGA by reconfiguring its gates. It's done all the time. You can literally take ARM IP and build a full-on Cortex core inside of a FPGA.
 
If they are massively programmable, how come I need Turing for the latest and greatest NVENC encoding? GPUs for transcoding videos are only JUST starting to get good, and only with Turing. CPUs rule the roost as far as quality to file size ratio, AND it doesn't matter what CPU you've got either. All recent CPUs can transcode x264 and x265 and get the same quality of results (just not speed).

Because software developers are FARKING LAZY. They will only program for what they know and until they die or get replaced with a new generation of programmers, they will do the same old crap.

It takes someone like Apple or similar to say recode your shit or get out of our ecosystem for them to get motivated enough.
 
Pretty much the only reason FPGAs are not more widespread is because it is very difficult to find people who are truly skilled at programming them. Verilog and VHDL are basically black arts at this point outside of the defense world and signal processing.

However, the upside to it is that if you do find someone, you're set. I have a headphone DAC/amp that runs on a Xilinx Spartan 7 that a competent engineer could program to be faster than my 9900K for transcoding.
Here's something that may help you understand the scope of how flexible FPGAs are:

Let's say someone is arguing about an ARM processor vs. a FPGA for a specific task. I could take a FPGA and literally build an ARM processor inside the FPGA by reconfiguring its gates. It's done all the time. You can literally take ARM IP and build a full-on Cortex core inside of a FPGA.
"Could, if, could, if, potential, could, would, should, if, could..."
 
"Could, if, could, if, potential, could, would, should, if, could..."

Only if you don't know what you're talking about. FPGAs are used every day in PCs and Macs to deliver performance far beyond what you will ever use if you're some 1337 gamer or home video editor. Apple just happens to be the most mainstream company to market (an admittedly overpriced) FPGA acceleration in their Afterburner card. But for actual professional use, FPGAs are everywhere. They run the internet, they run your 4G and 5G towers, they run neural networks with 1ms inference latency, they run basically any high end, high-throughput application where the volumes aren't high enough for ASICs. Intel even purchased Altera specifically because FPGAs are the future. Tesla's self driving technology is based around FPGA accelerators. You just aren't knowledgeable enough to know where to look, or don't have a truly heavy workload that actually requires a FPGA.

Arguing that FPGAs are "could of, would of, should of" is like arguing, "it's too bad the wheel never took off, it could have been a great technology!"
 
Apple just happens to be the most mainstream company to market (an admittedly overpriced) FPGA acceleration in their Afterburner card. But for actual professional use, FPGAs are everywhere.
Just as a 'slight' correction, the Afterburner card is very competitively priced; it just requires knowledge of the industry to see it.
For a comparison, here is RED's card for roughly the same purpose: https://www.bhphotovideo.com/c/product/1347480-REG/red_digital_cinema_775_0005_red_rocket_x.html
Apple's is 1/3rd the cost ($2000) and is capable of processing 6x 8K ProRes RAW streams or 23x 4K ProRes RAW streams simultaneously. The RED Rocket card isn't even capable of doing more than one 6K REDCODE RAW stream in real time.

If you want to edit at full resolution (say 8K, which the RED Rocket doesn't even support) and for whatever reason you don't want to use proxies, it's basically more cost- and time-effective to import REDCODE RAW and have it transcoded to ProRes RAW using the Afterburner card than to use the abysmally slow and three-times-as-expensive RED Rocket card.

$2000 only seems like a lot until you consider that the people using these cards are working on multi-million dollar movies. But regardless, considering the performance and the fact that this level of video encoding and decoding wouldn't be possible with an $8000 PC, it's more than reasonable.
 
Only if you don't know what you're talking about. FPGAs are used every day in PCs and Macs to deliver performance far beyond what you will ever use if you're some 1337 gamer or home video editor. Apple just happens to be the most mainstream company to market (an admittedly overpriced) FPGA acceleration in their Afterburner card. But for actual professional use, FPGAs are everywhere. They run the internet, they run your 4G and 5G towers, they run neural networks with 1ms inference latency, they run basically any high end, high-throughput application where the volumes aren't high enough for ASICs. Intel even purchased Altera specifically because FPGAs are the future. Tesla's self driving technology is based around FPGA accelerators. You just aren't knowledgeable enough to know where to look, or don't have a truly heavy workload that actually requires a FPGA.

Arguing that FPGAs are "could of, would of, should of" is like arguing, "it's too bad the wheel never took off, it could have been a great technology!"
Yes, things exist, bravo! They could be used everywhere, but, alas, there are reasons they aren't.
 
Just as a 'slight' correction, the Afterburner card is very competitively priced; it just requires knowledge of the industry to see it.
For a comparison, here is RED's card for roughly the same purpose: https://www.bhphotovideo.com/c/product/1347480-REG/red_digital_cinema_775_0005_red_rocket_x.html
Apple's is 1/3rd the cost ($2000) and is capable of processing 6x 8K ProRes RAW streams or 23x 4K ProRes RAW streams simultaneously. The RED Rocket card isn't even capable of doing more than one 6K REDCODE RAW stream in real time.

If you want to edit at full resolution (say 8K, which the RED Rocket doesn't even support) and for whatever reason you don't want to use proxies, it's basically more cost- and time-effective to import REDCODE RAW and have it transcoded to ProRes RAW using the Afterburner card than to use the abysmally slow and three-times-as-expensive RED Rocket card.

$2000 only seems like a lot until you consider that the people using these cards are working on multi-million dollar movies. But regardless, considering the performance and the fact that this level of video encoding and decoding wouldn't be possible with an $8000 PC, it's more than reasonable.

And the funny part is that Apple locks you in even less than PCs when it comes to individual software leveraging processing resources... FCPX rendering, for example, can be carried out by the CPU / GPU / Afterburner and even the T2 in some cases, not requiring specific hardware (other than a Mac, of course, because god forbid they have their own ecosystem)...
 
Only if you don't know what you're talking about. FPGAs are used every day in PCs and Macs to deliver performance far beyond what you will ever use if you're some 1337 gamer or home video editor. Apple just happens to be the most mainstream company to market (an admittedly overpriced) FPGA acceleration in their Afterburner card. But for actual professional use, FPGAs are everywhere. They run the internet, they run your 4G and 5G towers, they run neural networks with 1ms inference latency, they run basically any high end, high-throughput application where the volumes aren't high enough for ASICs. Intel even purchased Altera specifically because FPGAs are the future. Tesla's self driving technology is based around FPGA accelerators. You just aren't knowledgeable enough to know where to look, or don't have a truly heavy workload that actually requires a FPGA.

Arguing that FPGAs are "could of, would of, should of" is like arguing, "it's too bad the wheel never took off, it could have been a great technology!"

Well, wheels are literally used everywhere and in lots of things both cheap and expensive, consumer and enterprise, from Hot Wheels to the biggest dump trucks.
I only hear "expensive" when you talk about FPGAs.
I thought we were talking about GPUs?
Lots of "could" in your post.
Hardware is nothing without software.
 
There are millions of FPGAs being used in devices all over the world. There is nothing "could" about it. For example, every time you make a phone call, your data goes through multiple FPGAs. Every time you access the internet, your bits are going through FPGAs. When you listen to a high sample rate recording, it was probably processed through a FPGA somewhere along the line. When you put your car into self driving mode (or even radar based cruise control), you are relying on a FPGA. The reason people use $100 FPGAs instead of $5000 Threadripper computers in these applications is because they are cheaper and faster.

The bottom line - my original point - is that arguing about X86 performance vs. ARM is kind of dumb when Apple is already shipping accelerator devices that destroy X86 in certain applications. A 100MHz ARM processor with a FPGA Afterburner card would annihilate a $10,000 PC for video transcoding.
 
There are millions of FPGAs being used in devices all over the world. There is nothing "could" about it. For example, every time you make a phone call, your data goes through multiple FPGAs. Every time you access the internet, your bits are going through FPGAs. When you listen to a high sample rate recording, it was probably processed through a FPGA somewhere along the line. When you put your car into self driving mode (or even radar based cruise control), you are relying on a FPGA. The reason people use $100 FPGAs instead of $5000 Threadripper computers in these applications is because they are cheaper and faster.

The bottom line - my original point - is that arguing about X86 performance vs. ARM is kind of dumb when Apple is already shipping accelerator devices that destroy X86 in certain applications. A 100MHz ARM processor with a FPGA Afterburner card would annihilate a $10,000 PC for video transcoding.

x86 devices are also replete with FPGAs. Nobody (sane) is doing video decoding without offloading the work to their GPU's FPGA block dedicated to that logic. But in addition to being difficult to program, FPGAs are frequently quite limited. As a pedestrian example, my elderly Roku can feed 1080p data to a TV all day long in a form factor comparable to a USB stick. But it maxes out at high profile H.264 v4.1 and won't nudge a bit further. If I try to feed it a video stream encoded at a higher version number to take advantage of extra features, H.265, VC-1 (I'm dating myself, yes), Ogg video, or anything else, it'll require a transcode to take advantage of that dedicated logic. I'm not saying ARM Macs won't offload computation to an array of FPGAs in an effective way, and it's definitely the efficient way to go about it, but Apple hasn't got a monopoly on FPGA implementation. And pooh-poohing powerful SMP builds for handling flexible workloads seems like a weird stance to take, particularly on the [H]ardForum.
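For what it's worth, the "is there a dedicated block for this codec, or do we fall back to the CPU?" decision is already exposed on the Mac side through VideoToolbox. A rough sketch (the codec list is just an example):

Code:
import VideoToolbox
import CoreMedia

// Ask the OS whether a dedicated hardware decode block exists for each codec.
// If not, the only option is software decode on the general-purpose CPU --
// the same "transcode it first" situation as the old Roku above.
let codecs: [(String, CMVideoCodecType)] = [
    ("H.264", kCMVideoCodecType_H264),
    ("HEVC",  kCMVideoCodecType_HEVC),
]

for (name, codec) in codecs {
    if VTIsHardwareDecodeSupported(codec) {
        print("\(name): dedicated decode block available, offload it")
    } else {
        print("\(name): no hardware block on this machine, fall back to CPU decode")
    }
}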
 
That's not how FPGAs work, FPGAs are incredibly versatile. In fact, the entire point of them is completely reconfigurable hardware at the GATE level. They are massively parallel architectures and are far superior to X86 at anything related to signal processing, transcoding, etc. The same FPGA could be flashed to transcode video or generate rasters from FFTs. No hardware changes required. Think of them like a flexible ASIC.

FPGAs generally trade a little bit of performance vs. a full on dedicated ASIC for the flexibility to be reprogrammed on the fly for any task. They are not suitable for every task, which is why it's nice to have a main CPU for certain things. But for the tasks they are suited (high bandwidth massively parallel applications) they are insanely powerful.

I'm well aware that they are reprogrammable.

In my (albeit limited, I am not an electrical engineer) experience though, more often than not, when the time comes to expand functionality by reprogramming an FPGA, the existing FPGA is no longer capable enough to do the task you'd really want it to. You now need Xilinx's latest and greatest model, requiring yet another board respin :p

I also think you greatly overstate the efficiency of FPGAs. Sure, you can perform better by designing a very special purpose design into your FPGA, like a dedicated VP9 encoder. It will be MUCH less efficient than the same dedicated design would be committed to silicon, but it will still be faster than a general purpose CPU at the task of encoding VP9.

Sure, you COULD design an entire Cortex-like core in an FPGA, but you wouldn't want to. Your performance would be abysmally low, and your power usage WAY too high compared to the static silicon design.
 
I'm well aware that they are reprogrammable.

In my (albeit limited, I am not an electrical engineer) experience though, more often than not, when the time comes to expand functionality by reprogramming an FPGA, the existing FPGA is no longer capable enough to do the task you'd really want it to. You now need Xilinx's latest and greatest model, requiring yet another board respin :p

This is generally not the case. While FPGAs do come in speed grades, 90% of what matters is how large they are (how many logic cells). For most things FPGAs are used for, there is really no difference between a Spartan 6 or Spartan 7 with 50k logic cells. And one thing Xilinx is really good about is keeping pin compatible designs within a family. I could remove a 50k logic cell FPGA and plop in a 25k logic cell one and it has the exact same BGA layout, requiring no respins.

I also think you greatly overstate the efficiency of FPGAs. Sure, you can perform better by designing a very special purpose design into your FPGA, like a dedicated VP9 encoder. It will be MUCH less efficient than the same dedicated design would be committed to silicon, but it will still be faster than a general purpose CPU at the task of encoding VP9.

This is true - an ASIC will be more efficient than a FPGA, which will be more efficient than a general purpose CPU. But for specific tasks FPGAs are just orders of magnitude faster than CPUs (and ASICs can be much faster than FPGAs!)

Sure, you COULD design an entire Cortex-like core in an FPGA, but you wouldn't want to. Your performance would be abysmally low, and your power usage WAY too high compared to the static silicon design.

This is more nuanced. Performance is basically identical clock for clock. But where it becomes less practical is when it comes to higher speed ARM cores. It is not practical to design a 1GHz ARM soft core in a FPGA; there is no point or advantage vs the dedicated silicon. But if you only need a few hundred MHz it makes total sense - it performs similarly to the dedicated silicon chip, and you save power and cost by having one physical component instead of two. And the FPGA would still have lots of room inside for other tasks such as digital signal processing, PLUS it could ingest huge amounts of instantaneous bandwidth due to the high speed transceivers built in. A CPU literally can't ingest as many samples as a FPGA can, which is a bottleneck to performance.

Soft ARM cores done inside FPGAs are extremely common for those reasons

Anyway, I guess my point is that the entire computer industry (not just Apple) may be moving towards an ARM general purpose CPU aided by specialized ASICs and general purpose FPGAs. That architecture has really promising performance advantages vs. traditional x86 for most things computers are used for.
 
This is generally not the case. While FPGAs do come in speed grades, 90% of what matters is how large they are (how many logic cells). For most things FPGAs are used for, there is really no difference between a Spartan 6 or Spartan 7 with 50k logic cells. And one thing Xilinx is really good about is keeping pin compatible designs within a family. I could remove a 50k logic cell FPGA and plop in a 25k logic cell one and it has the exact same BGA layout, requiring no respins.



This is true - an ASIC will be more efficient than a FPGA, which will be more efficient than a general purpose CPU. But for specific tasks FPGAs are just orders of magnitude faster than CPUs (and ASICs can be much faster than FPGAs!)



This is more nuanced. Performance is basically identical clock for clock. But where it becomes less practical is when it comes to higher speed ARM cores. It is not practical to design a 1GHz ARM soft core in a FPGA; there is no point or advantage vs the dedicated silicon. But if you only need a few hundred MHz it makes total sense - it performs similarly to the dedicated silicon chip, and you save power and cost by having one physical component instead of two. And the FPGA would still have lots of room inside for other tasks such as digital signal processing, PLUS it could ingest huge amounts of instantaneous bandwidth due to the high speed transceivers built in. A CPU literally can't ingest as many samples as a FPGA can, which is a bottleneck to performance.

Soft ARM cores done inside FPGAs are extremely common for those reasons

Anyway, I guess my point is that the entire computer industry (not just Apple) may be moving towards an ARM general purpose CPU aided by specialized ASICs and general purpose FPGAs. That architecture has really promising performance advantages vs. traditional x86 for most things computers are used for.
Sounds very expensive to program for and to keep programming for....
 
Sounds very expensive to program for and to keep programming for....

It is certainly more expensive to find a VHDL or Verilog programmer than it is to find someone with Python/Java/etc. But that's not really a problem for megacorps like Intel/Apple/etc. A good FPGA guy would typically start in the low six figures vs $60-70k for a normal software developer. Very good FPGA guys can command over $200k.
 
Pretty much the only reason FPGAs are not more widespread is because it is very difficult to find people who are truly skilled at programming them. Verilog and VHDL are basically black arts at this point outside of the defense world and signal processing.

However, the upside to it is that if you do find someone, you're set. I have a headphone DAC/amp that runs on a Xilinx Spartan 7 that a competent engineer could program to be faster than my 9900K for transcoding.
Well, I mean, HDLs are not crazy rare. Any time a company wants to design any logic they are going to use an HDL (Verilog or VHDL, plus SystemVerilog or SystemC for verification, combined with Python scripts, etc.). Intel uses them to design their processors, AMD uses them, Nvidia, Qualcomm, Apple, ARM, TI, Marvell, Broadcom, Asmedia, NXP, etc etc etc. It's obviously not as common as general software development languages, but I'm just saying every large company that does any sort of logic design is using them and has entire engineering teams that use them.
There are millions of FPGAs being used in devices all over the world. There is nothing "could" about it. For example, every time you make a phone call, your data goes through multiple FPGAs. Every time you access the internet, your bits are going through FPGAs. When you listen to a high sample rate recording, it was probably processed through a FPGA somewhere along the line.
I'll take it one step even further. People are often familiar with high end, high performance FPGAs, but they forget that there are very small FPGAs that do very basic functionality. When you pick up your phone and simply USE it, there is a good chance your phone has something like this:
http://www.latticesemi.com/iCE40
This is a very small, basic FPGA and might do basic power sequencing logic, I2C logic, SPI logic, interrupt control, and basic logical signal routing.


x86 devices are also replete with FPGAs. Nobody (sane) is doing video decoding without offloading the work to their GPU's FPGA block dedicated to that logic. But in addition to being difficult to program, FPGAs are frequently quite limited. As a pedestrian example, my elderly Roku can feed 1080p data to a TV all day long in a form factor comparable to a USB stick. But it maxes out at high profile H.264 v4.1 and won't nudge a bit further. If I try to feed it a video stream encoded at a higher version number to take advantage of extra features, H.265, VC-1 (I'm dating myself, yes), Ogg video, or anything else, it'll require a transcode to take advantage of that dedicated logic. I'm not saying ARM Macs won't offload computation to an array of FPGAs in an effective way, and it's definitely the efficient way to go about it, but Apple hasn't got a monopoly on FPGA implementation. And pooh-poohing powerful SMP builds for handling flexible workloads seems like a weird stance to take, particularly on the [H]ardForum.
I don't know of any GPU that has any onboard programmable logic like an FPGA has. GPUs are usually pretty specialized for their function; using programmable logic would take a lot more silicon space/power than they have to spare.

In general I always hesitate to accept the idea of "rise of the FPGA in common computing".

All of the below is in the context of FPGA vs. purpose-designed ASIC.
The advantage of the FPGA as a general accelerator would be that you have a single block of silicon that you can reconfigure for any application. For a particular application it may be faster than a CPU. However, for any particular application, the same logic burned into an ASIC will always be faster/smaller/more efficient (smaller is pretty much guaranteed; faster vs. more efficient is where you would be making a balance).

So then the question becomes:
How many different applications do you need for your accelerator to handle before an FPGA would make more sense than an ASIC?

1 application? ASIC wins
2 applications? You could probably fit two separate ASIC accelerators into the same silicon area as a single FPGA for the same tasks, and the ASIC design would still be smaller and faster/more efficient
3 applications?
4 applications?
5?

I don't know; at what point does the FPGA win for general computing acceleration? FPGA logic is not exactly dense, and it's not fast.
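Just to put a toy number on that question: every figure below is invented purely for illustration (a real comparison depends on the process, the workload and the fabric), but it shows the shape of the trade-off, namely that programmable fabric pays a fixed area penalty per function, so dedicated blocks win until the number of distinct workloads gets large.

Code:
import Foundation

// Toy model with made-up numbers: an ASIC spends dedicated gates per function,
// while one FPGA can be reconfigured for any single function at a time but
// its programmable fabric costs several times the area of hard-wired logic.
let asicAreaPerFunction = 1.0   // normalized area of one fixed-function block (assumed)
let fpgaOverheadFactor  = 10.0  // assumed area penalty of LUT fabric vs. hard gates
let fpgaArea = fpgaOverheadFactor * asicAreaPerFunction

for workloads in 1...15 {
    let asicTotal = Double(workloads) * asicAreaPerFunction  // one hard block per workload
    let winner = asicTotal < fpgaArea ? "ASIC blocks" : "single FPGA"
    print("\(workloads) distinct workloads -> smaller option: \(winner)")
}
// With these invented numbers the FPGA only pulls ahead once roughly ten or
// more distinct accelerators have to share the same die area.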

There are other contexts where the FPGA wins NOT because it's better performing, but because the economics/time make more sense.
Do you need 1-100 designs with your logic and it runs OK on an FPGA? Then you probably are not going to need to design an ASIC; it will cost an enormous amount of $$ to spin up just a few ASIC chips. Go with an FPGA.
Are you an engineer who is testing/debugging logic that will ultimately target an ASIC? Have fun waiting a few months every time you want to test a single change. Get an FPGA for testing/debugging.

Nvidia has entire FPGA racks that do nothing but run the logic for their GPUs (basically running all the logic of a single 700mm^2 GPU chip on several racks of FPGAs at highly reduced speeds.... again... FPGAs are not dense compared to ASICs...)
https://www.cadence.com/en_US/home/...otium-s1-fpga-based-prototyping-platform.html
https://blogs.nvidia.com/blog/2011/05/16/sneak-peak-inside-nvidia-emulation-lab/
 