Softbank selling ARM to Nvidia, finally, after years of speculation...

Games I'll give you. That's a really big hurdle to clear. But it's also not one that can't be cleared: the biggest games are not the most graphically intensive games, even on the desktop, and with Nvidia graphics, I don't see it as impossible. Just not... soon.

What's the practical difference between an ARM Chromebook and a Windows x86 laptop?

To me, it's performance and application availability. The performance is what Nvidia is now in a position to address, and application availability from a strictly consumer perspective is a non-issue. Porting to ARM isn't going to be fast, but it's not going to be that difficult either.
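To put the "just recompile" part in concrete terms, here's a minimal sketch: nothing in typical consumer code cares which ISA it runs on, and the cross-compiler named in the comments is just the stock Debian/Ubuntu package, given as an example:

```c
/* portable.c - nothing here is x86-specific; the same source builds
 * for x86-64 or ARM with nothing but a different compiler.
 *
 *   native build:     gcc portable.c -o portable
 *   ARM cross-build:  aarch64-linux-gnu-gcc portable.c -o portable-arm
 */
#include <stdio.h>

int main(void)
{
    /* Predefined macros are about the only place the ISA shows through. */
#if defined(__aarch64__)
    puts("running on 64-bit ARM");
#elif defined(__x86_64__)
    puts("running on x86-64");
#else
    puts("running on something else entirely");
#endif
    return 0;
}
```

Real ports mostly fight build systems and the occasional hand-written intrinsic, not the language itself.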
I would agree that performance and application availability are the difference, and as much as we would like to say Nvidia is going to come out with some x86 killer, it's not going to happen that easily. Looking at Apple will give you a good mark of what can be done with a bunch of resources and forced software adaptation.

I run x86 because its ecosystem is massively vast. That vastness does not come overnight, and it does not come over time if software is not forced to adapt. It could for x86, as it was the sole practical arch for a very large period of time. It's simply not worth the investment for Nvidia when they can throw that money at a server chip and not have to deal with any of that.

And to state my position on personally using a different arch: I would happily fire up a POWER9 rig if I had one, and there would be very real things to do with it (including emulating x86 if you really want). This is because it's a decently powerful chip. If an ARM chip like that existed, it could fill a similar enthusiast/enterprise role, but that is very far off from a mainstream desktop CPU.
 
Here is one for the conspiracy theorists: this buyout will place arm licensing under US jurisdiction. Today, to use an understatement, is not a happy day for the Chinese tech sector.
 
Here is one for the conspiracy theorists: this buyout will place arm licensing under US jurisdiction.

It already was, at least in part.

I'm not entirely sure how, but ARM Holdings was forced to abide by the Trump Administration's export restrictions to China last year.

It confused me when I read it in the news, and I never spent the time to figure out why.
 
I run x86 because its ecosystem is massively vast. That vastness does not come overnight, and it does not come over time if software is not forced to adapt. It could for x86, as it was the sole practical arch for a very large period of time. It's simply not worth the investment for Nvidia when they can throw that money at a server chip and not have to deal with any of that.
I do get it. I'm not in a hurry to jump on ARM outside of my phones and Pis either.

But consider just how much of a limitation 'old software support' really is or isn't. If the software is being maintained then there's someone with a codebase that can patch APIs and do a recompile. If there isn't a maintainer, then the software is just as likely to be unoptimized for modern hardware and full of security holes.

And at some point, anything that's abandoned but still necessary can be packaged in a secure container with an emulated or translated ISA if need be.
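For what it's worth, on Linux that usually means qemu-user plus binfmt_misc, which is also how Docker runs foreign-architecture containers today. A minimal sketch of the idea, with the emulator invoked explicitly; the binary name is a made-up placeholder:

```c
/* run_legacy.c - the 'translated ISA' idea in miniature: on an ARM host,
 * hand an abandoned x86-64 binary to a user-mode emulator instead of
 * running it natively. In practice binfmt_misc registers the emulator
 * with the kernel so this dispatch happens automatically.
 *
 *   build: gcc run_legacy.c -o run_legacy   (needs qemu-user installed)
 */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* qemu-x86_64 translates the foreign instructions and syscalls on the fly. */
    execlp("qemu-x86_64", "qemu-x86_64", "./legacy-x86-app", (char *)NULL);
    perror("execlp");  /* only reached if the emulator isn't installed */
    return 1;
}
```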
 
I do get it. I'm not in a hurry to jump on ARM outside of my phones and Pis either.

But consider just how much of a limitation 'old software support' really is or isn't. If the software is being maintained then there's someone with a codebase that can patch APIs and do a recompile. If there isn't a maintainer, then the software is just as likely to be unoptimized for modern hardware and full of security holes.

And at some point, anything that's abandoned but still necessary can be packaged in a secure container with an emulated or translated ISA if need be.

The problem I see is: where does the push come from to get developers to jump ship? It's easy for Apple, as they can command developers around, but until there is a decent market chunk, devs aren't going to bother. That's why I think we will see years of ARM Chromebooks and tablet-like devices before Nvidia can even assess whether they have the option to enter that market, and that's not what they spent $40B for.
 
It already was, at least in part.

I'm not entirely sure how, but ARM Holdings was forced to abide by the Trump Administration's export restrictions to China last year.

It confused me when I read it in the news, and I never spent the time to figure out why.

That is not so confusing: operate in US markets, abide by US law.

The advantage to the US in this new situation is that if/when a situation arises where a company is forced to choose between operating in the US and trading with China, a US-held company has no options where a foreign-held one does. That is not even mentioning the advantages of having the company leadership living nearby should they decide to, ah, get creative in their interpretation of American laws.
 
It is a crime if they use their leverage after buying ARM to put other ARM licensees who are their competitors at a disadvantage.

That is the very definition of anti-competitive behavior.

SoftBank didn't face this scrutiny because they didn't directly compete with ARM licensees. (At least not that I am aware of)

If Nvidia continues to run ARM like it has been run in the past, there is no problem. If they use the fact that they have bought ARM to give themselves an advantage over other ARM licensees in other markets in which they compete, it definitely runs afoul of the Sherman Act or the Clayton Act (or possibly both).

I don't know if this will be held up prior to going through, but if they start using their ownership of ARM Holdings to hurt other entities they compete with as a user of ARM chips, I bet it will go to the courts in a hurry.

About the only relevant passage I could find in the above two acts you cited was this:

  • mergers and acquisitions that substantially reduce market competition.

And that can easily be bypassed w/out catching the eye of regulators. Then there's the Celler–Kefauver Act, which allows the Federal govt to stop vertical mergers, but I don't see the US govt having any issue w/this buyout. So that brings us back to the market competition clause, which they can easily get around by just influencing the direction of the market with their IP rather than directly competing. Why compete when you can sell to the market?
 
About the only relevant passage I could find in the above two acts you cited was this:

  • mergers and acquisitions that substantially reduce market competition.

And that can easily be bypassed w/out catching the eye of regulators. Then there's the Celler–Kefauver Act, which allows the Federal govt to stop vertical mergers, but I don't see the US govt having any issue w/this buyout. So that brings us back to the market competition clause, which they can easily get around by just influencing the direction of the market with their IP rather than directly competing. Why compete when you can sell to the market?

I am by no means a legal professional, but it would seem to me that if the DOJ could file charges under Section 2 of the Sherman Act because Microsoft used their market dominance to try to take control of the browser market, similar charges could be filed IF Nvidia tries to use their control of ARM Holdings to give themselves an unfair advantage in any other markets.

I do not claim to understand how the above law was interpreted in that case; this is way outside of my area of expertise. But the DOJ and several states did, and they won the case.
 
Sometimes companies buy other companies just so someone else doesn't get their hands on them. The "value" of a company is never measured just in what it brings to the table for you, but also in what it brings to the table by your competition not having it. And I don't mean just increasing license prices or milking them for money, but in the strategic direction of that industry. You have to look around at various other companies and ask what THEY would do with ARM and how that would potentially impact Nvidia. I'm sure Nvidia's executives were asking that question before pondering the purchase. And, yes, there's also what Nvidia themselves want to do with the company as well. There are so many reasons why a purchase might happen that don't appear obvious on their face. So, Nvidia being able to steer the ship ... AND ... someone else not steering the ship ... leads to an attractive purchase price.
 
Sometimes companies buy other companies just so someone else doesn't get their hands on them. The "value" of a company is never measured just in what it brings to the table for you, but also in what it brings to the table by your competition not having it. And I don't mean just increasing license prices or milking them for money, but in the strategic direction of that industry. You have to look around at various other companies and ask what THEY would do with ARM and how that would potentially impact Nvidia. I'm sure Nvidia's executives were asking that question before pondering the purchase. And, yes, there's also what Nvidia themselves want to do with the company as well. There are so many reasons why a purchase might happen that don't appear obvious on their face. So, Nvidia being able to steer the ship ... AND ... someone else not steering the ship ... leads to an attractive purchase price.

Sometimes, but it's unlikely that's the case here. ARM fills in the missing pieces of NVidia's end-to-end computation solution.

NVidia is going to leverage the hell out of ARM. The first thing they plan to try is up-selling GPU and AI licenses to ARM licensees. This isn't speculation; this is what they said in an interview. ARM revenues are pretty low, but if NVidia can stack more GPU and AI license fees on top, then they start having a multiplier effect.

The next biggest change they indicated was accelerating the roadmap. A faster roadmap means more new designs to license, and those new licenses are where you will probably see some price increases, since old licenses really can't be changed.

SoftBank was reportedly already doing large increases:
https://www.reuters.com/article/us-...-for-some-customers-sources-say-idUSKCN24G1RM

The first roadmap push will be servers. I am betting they will be pushing higher license fees and royalties for the server market than what is in mobile.

NVidia wasn't worried about anyone else taking over, unless that was AMD; everyone does everything out of fear of AMD these days. ;)

NVidia is now a full stack player. 5 years from now ARM/NVidia designs will be everywhere. ARM Servers will be everywhere. NVidia GPUs will be tagging along in smartphones. NVidia SoCs will be powering Windows ARM laptops.
 
Nvidia purchase of Arm is a 'disaster' for the UK tech industry: Government to probe $40 billion sale of British chip designer after co-founder says it is 'the equivalent to letting Trump get his hands on Trident'
  • The deal is still subject to regulatory approval including from the UK government
  • Hermann Hauser helped spin Arm off from former parent firm Acorn in 1990
  • He said the government should impose restrictions on the deal to protect jobs
  • Hauser said Arm is the last globally significant European technology company
  • The government says it will review the deal to check for any impact on the UK
  • Nvidia says it will keep the Arm headquarters in the UK and expand operations

The sale of British chip designer Arm to Nvidia for $40 billion (£31 billion) - described as a 'disaster' for the UK tech industry and 'the equivalent to letting Trump get his hands on Trident' by an Arm co-founder - is to be probed by the UK government.

Nvidia confirmed the deal to buy Arm on Monday, just four years after the firm was purchased by Japan's SoftBank Group for $32 billion (£24 billion) in 2016.

Arm is best known as the designer of processor chips used in most major smartphones - including both Apple and Samsung devices.

Hermann Hauser said the sale to the American chip maker would be a 'disaster for Cambridge, the UK and Europe' and see the 'last European technology company with any global relevance sold off to the Americans'.

Hauser, who helped spin Arm off from former parent company Acorn Computers in 1990, has urged the UK government to work to block the sale of the British firm.

However, Nvidia has pledged to keep Arm headquartered in Cambridge, while also promising to expand on Arm's work to build a 'world-class' technology centre.

Prime Minister Boris Johnson is said to be taking a personal interest in the deal for Arm which he described as playing a 'vital role' in the UK tech sector.
 
You can run a desktop on a potato. Not even figuratively speaking: you really should be able to perform all of the basic tasks that most consumers do with a desktop operating system using the voltage supplied by a potato, powering a modern, efficient architecture.

Well, that's probably a reach, but what's not a reach is that modern 'desktop stuff' just doesn't need a lot. The Raspberry Pi line is plenty evidence of that; the Pi 4 can do pretty much any desktop task, including some gaming, quite well while being extremely cheap and using an architecture (both the various ISAs and the manufacturing process) that is nearly obsolete itself.

Apple has shown what a modern ARM, really modern 'not x86', system can do. And by exercising strict control over their product stack, they've been able to ensure that software and hardware are optimized together alongside a focus on efficiency and security. Imagine how fast desktop computers could be if they received the same focus!
Yeah, I can imagine how restricted I would be on what I can buy and run... overclocking would be a thing of the past. It would be the bestest thing ever!
Seriously, do you really want this in the PC market? There is a reason the PC market is much larger than Apple and consoles... let's take that away and build a second Apple, that's the solution!
 
  • Hauser said Arm is the last globally significant European technology company
  • The government says it will review the deal to check for any impact on the UK
I think they need to take a closer look at themselves as an economy, and at the laws that affect industry, to see why they are not that relevant in tech anymore.
 
Boris Johnson will just talk to Jensen and get assurances that Cambridge will stick around and that the job growth is real, like Nvidia promises. He'll then go public with it and spin it as a good deal he personally helped oversee, while Trump pats him on the back for being a good boy. The UK let Arm go 4 years ago; crying about it now is silly. In fact, they should be happy an American company bought it rather than, say, an Indian, Chinese, or Korean one, because those companies usually have state backing and influence.
 
A similar alarmist story ran when they sold to SoftBank:
https://www.dailymail.co.uk/news/ar...oup-warns-24bn-takeover-benefits-Britain.html

They will make some noise about job guarantees, but it's mostly smoke. The UK will approve, if they even have much of a say in this at all anymore.


Right, the cost of making more and more advanced cellphone chips every year, plus more advanced Mali cores (and then funneling that into a server CPU with an efficient interconnect), is making it harder for ARM to turn a profit. If you have a downturn in phones, then you have no other income!

The only thing that will fix that is Nvidia putting their server know-how behind the company (mass-produced server chips = massive margins, even during economic downturns). I would expect Nvidia to push to be the first licensed ARM core with 256-bit scalable vector units.
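Assuming "scalable vector units" refers to ARM's Scalable Vector Extension (SVE), the interesting part is that SVE code is vector-length agnostic: the same binary runs unchanged whether the silicon implements 128-bit or 2048-bit vectors. A minimal sketch using the ACLE intrinsics (the function is illustrative; compile with something like gcc -O2 -march=armv8-a+sve):

```c
#include <stdint.h>
#include <arm_sve.h>

/* y[i] += a * x[i], written once, runs on any SVE vector width.
 * svcntw() reports how many 32-bit floats fit in a vector register
 * on *this* chip, so a 256-bit part simply takes fewer iterations. */
void saxpy_sve(float a, const float *x, float *y, int64_t n)
{
    for (int64_t i = 0; i < n; i += svcntw()) {
        svbool_t pg = svwhilelt_b32(i, n);       /* predicate masks off the tail */
        svfloat32_t vx = svld1(pg, x + i);
        svfloat32_t vy = svld1(pg, y + i);
        vy = svmla_x(pg, vy, vx, svdup_f32(a));  /* vy += vx * a */
        svst1(pg, y + i, vy);
    }
}
```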

Combine that with their GPU license, and you have an unbeatable combination (and the cell phones will continue to get better off the same basic core improvements, minus the vectors!). You will finally have an easy way to create a backward-compatible Switch 2, just as one positive result of this!

About the only uncertain piece of the puzzle here is where Mali will end up. It's totally redundant for a company like Nvidia (but selling it off to another company would mean creating another Adreno). I would expect they would end the redundant GPU research and just rebrand last-generation GeForce parts as Mali (that way, they could maintain the 'give it away if you buy a CPU' price point of Mali).
 
Seriously, do you really want this in the PC market?
This really depends on how you define 'PC'.

The reality is that if the hardware and software do the work that needs to be done, the IP inside doesn't really matter.

Software has transitioned away from majority closed-source, ISA-specific code to open source frameworks designed to compile on a variety of ISAs. Most of this work has already been done by Apple, Microsoft, and the various major Linux contributors along with the various major end-user software firms and open-source development communities.

If it's consumer facing and runs on OS X or Linux, it probably already runs on ARM. The only barrier is developer incentive to put in the effort.
Yeah, I can imagine how restricted I would be on what I can buy and run... overclocking would be a thing of the past. It would be the bestest thing ever!
Overclocking is already pretty restricted. Where it's possible, you're usually trading a fraction more performance for significant increases in heat output and likely noise. It's nothing like the jumpers of old, or the systems where a bus speed increase of 50% would result in 50% more clockspeed with no other modifications required. Intel has started running their parts at the edge, AMD parts barely benefit from manual overclocking, and GPUs are generally limited by power envelopes, just like mobile devices.

I don't mean to take the magic out of it, but generally speaking, more performance is obtained through better parts, not cranking lower-end products to the max.

If Nvidia intends to take on x86 with ARM, it's hard to imagine them holding back on things like clock speeds.
 
This really depends on how you define 'PC'.

The reality is that if the hardware and software do the work that needs to be done, the IP inside doesn't really matter.

Software has transitioned away from majority closed-source, ISA-specific code to open source frameworks designed to compile on a variety of ISAs. Most of this work has already been done by Apple, Microsoft, and the various major Linux contributors along with the various major end-user software firms and open-source development communities.

If it's consumer facing and runs on OS X or Linux, it probably already runs on ARM. The only barrier is developer incentive to put in the effort.

Overclocking is already pretty restricted. Where it's possible, you're usually trading a fraction more performance for significant increases in heat output and likely noise. It's nothing like the jumpers of old, or the systems where a bus speed increase of 50% would result in 50% more clockspeed with no other modifications required. Intel has started running their parts at the edge, AMD parts barely benefit from manual overclocking, and GPUs are generally limited by power envelopes, just like mobile devices.

I don't mean to take the magic out of it, but generally speaking, more performance is obtained through better parts, not cranking lower-end products to the max.

If Nvidia intends to take on x86 with ARM, it's hard to imagine them holding back on things like clock speeds.
I wasn't talking about the ISA or IP. I was specifically talking about this: "Apple has shown what a modern ARM, really modern 'not x86', system can do. And by exercising strict control over their product stack, they've been able to ensure that software and hardware are optimized together alongside a focus on efficiency and security. Imagine how fast desktop computers could be if they received the same focus!"

Yes, if you lock down a system to where people can't do what they want, control what parts I can put in it, and control every aspect of how I use my computer, sure, you can more easily transition between archs. But I would lose out on a lot more than I gain. I'm not against using ARM in my desktop; I'm against being completely locked to the whims of a single company/entity. The only reason Apple gets away with it is because they are not the majority. If they were, it'd be considered anti-competitive and a monopoly. As long as they don't grow too big, they can and will continue to do this. There is a reason that after all these years and followers they are still only a tiny percentage of the desktop space.

If Nvidia gets into something like this where they build the entire ecosystem around their products, cool, but I won't be buying one unless it's open, which Nvidia has already shown in the past they won't be (look back at ANY of their recent or not-so-recent gifts to the world, like PhysX or G-Sync). Nvidia likes proprietary stuff. You talk about all of this software transitioning to open source or whatever, then talk about Nvidia bringing a closed ecosystem together. Those two don't go together in any shape or form, which is the worry. It WAS open and inclusive; now a company that is the exact opposite of this thinks it's worth $40 billion to buy. They sure aren't using this to test the waters on moving to open source and sharing ;).
 
I wasn't talking about the ISA or IP. I was specifically talking about this: "Apple has shown what a modern ARM, really modern 'not x86', system can do. And by exercising strict control over their product stack, they've been able to ensure that software and hardware are optimized together alongside a focus on efficiency and security. Imagine how fast desktop computers could be if they received the same focus!"

Yes, if you lock down a system to where people can't do what they want, control what parts I can put in it, and control every aspect of how I use my computer, sure, you can more easily transition between archs. But I would lose out on a lot more than I gain. I'm not against using ARM in my desktop; I'm against being completely locked to the whims of a single company/entity. The only reason Apple gets away with it is because they are not the majority. If they were, it'd be considered anti-competitive and a monopoly. As long as they don't grow too big, they can and will continue to do this. There is a reason that after all these years and followers they are still only a tiny percentage of the desktop space.

If Nvidia gets into something like this where they build the entire ecosystem around their products, cool, but I won't be buying one unless it's open, which Nvidia has already shown in the past they won't be (look back at ANY of their recent or not-so-recent gifts to the world, like PhysX or G-Sync). Nvidia likes proprietary stuff. You talk about all of this software transitioning to open source or whatever, then talk about Nvidia bringing a closed ecosystem together. Those two don't go together in any shape or form, which is the worry. It WAS open and inclusive; now a company that is the exact opposite of this thinks it's worth $40 billion to buy. They sure aren't using this to test the waters on moving to open source and sharing ;).
So I was more talking about performance; with respect to openness vs. a closed ecosystem I do see what you're saying. I don't know if Nvidia will be able to exert this level of pressure on the market, but it's certainly something to keep tabs on.
 
So I was more talking about performance; with respect to openness vs. a closed ecosystem I do see what you're saying. I don't know if Nvidia will be able to exert this level of pressure on the market, but it's certainly something to keep tabs on.
Yeah, I should have quoted only the part I was really concerned about, lol. Closed systems are much easier to keep up with, as there is a much more limited scope, so things can be more tightly coupled. This is where the inclusion gets thrown out, though, and one company has all the control. Could you imagine how great it would be if you were forced to only use apps from the Windows Store and not allowed to download 3rd-party applications without M$ approval? This is regulated here in the US: a general-purpose computing device isn't allowed to lock other apps out (which I think is where the entire Epic/Apple thing is going; when are phones going to be classified as general computing devices vs. just a phone?).

I am a glass-half-empty kind of guy; I look at motives and possible downsides much more than I look at upsides, because let's be honest, companies aren't your friend. They are trying to gain market share, gain higher margins, and make more money (in general, there are very few businesses that exist to actually help people). I don't know how much pressure Nvidia can exert or how long it will take, but they obviously feel they can make something happen if they're willing to spend that much cash. They aren't doing it to help all the other ARM producers out ;).
 
Yeah, I should have quoted only the part I was really concerned about, lol. Closed systems are much easier to keep up with, as there is a much more limited scope, so things can be more tightly coupled. This is where the inclusion gets thrown out, though, and one company has all the control. Could you imagine how great it would be if you were forced to only use apps from the Windows Store and not allowed to download 3rd-party applications without M$ approval? This is regulated here in the US: a general-purpose computing device isn't allowed to lock other apps out (which I think is where the entire Epic/Apple thing is going; when are phones going to be classified as general computing devices vs. just a phone?).

I am a glass-half-empty kind of guy; I look at motives and possible downsides much more than I look at upsides, because let's be honest, companies aren't your friend. They are trying to gain market share, gain higher margins, and make more money (in general, there are very few businesses that exist to actually help people). I don't know how much pressure Nvidia can exert or how long it will take, but they obviously feel they can make something happen if they're willing to spend that much cash. They aren't doing it to help all the other ARM producers out ;).

Personally I think there's a lot of merit to closed systems. People hate on Apple but I've never been disappointed by my iPhones over the years despite them being "closed" and the performance has been great. Sometimes having an open system isn't so wonderful (e.g. Windows and its myriad of viruses/trojans). I also view the NVIDIA takeover of ARM as a good thing because I think it will speed up innovation for ARM CPU/GPU in the notebook/desktop segment vs had it gone to anyone else. It may raise prices and some of the smaller guys on razor thin margins may get squeezed out but oh well. I guess they can always fall back to RISC-V if that is the case. I'm not a believer in community-sourced technology, as it always slows things down too much and sometimes never gains traction--just look at Linux on the desktop for example.
 
Personally I think there's a lot of merit to closed systems. People hate on Apple but I've never been disappointed by my iPhones over the years despite them being "closed" and the performance has been great. Sometimes having an open system isn't so wonderful (e.g. Windows and its myriad of viruses/trojans). I also view the NVIDIA takeover of ARM as a good thing because I think it will speed up innovation for ARM CPU/GPU in the notebook/desktop segment vs had it gone to anyone else. It may raise prices and some of the smaller guys on razor thin margins may get squeezed out but oh well. I guess they can always fall back to RISC-V if that is the case. I'm not a believer in community-sourced technology, as it always slows things down too much and sometimes never gains traction--just look at Linux on the desktop for example.
Ahh yes, there is something to be said of closed systems. They're closed. Aka, they stifle competition and you end up paying 4x the cost for replacing your RAM because it's Apple RAM, lol... yeah, you guys who've been here a while know what I'm talking about. I had to use an iPhone a few times... it was fine and easy to use; my parents used them for a while and eventually moved over to Android. I also do development and 100% hate Apple products and how locked down they are. Simple things are more complicated, with tons of hoops to jump through, and require 2x the effort of what they should. I know I'm probably in the minority on this, but did you ever notice how there are a lot of simple devices that have 2 versions... one for everyone else, and one for Apple (which everyone else can also use, but it's more complex)? Just a simple example: something as simple as a Bluetooth ELM scanner for your car. Oh, that's right, Apple decided there was no use in including the Bluetooth Serial Port Profile, since they don't sell any devices that use it. So, while Android, PC, etc. can all work fine over Bluetooth, the makers have to build 2 versions, one that uses Bluetooth and one that uses WiFi. The funny part is, you can jailbreak your iPhone and install the missing drivers, as they are available, but it doesn't help Apple out, so why would they? And you can't sell a product that requires someone to jailbreak their phone to use it, so they have to sell 2 versions and add confusion, for what reason? Bluetooth SPP doesn't have any known vulnerabilities, so it's not for security.
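For the curious, SPP really is nothing more than a serial stream over an RFCOMM socket, which is why it's so convenient for gadgets like that. A minimal Linux/BlueZ sketch; the adapter address and channel are placeholders, and "ATZ" is the usual ELM327-style reset command:

```c
/* spp_connect.c - minimal Bluetooth SPP client sketch using BlueZ.
 *   build: gcc spp_connect.c -o spp_connect -lbluetooth
 * The device address and RFCOMM channel below are made-up placeholders. */
#include <stdio.h>
#include <unistd.h>
#include <sys/socket.h>
#include <bluetooth/bluetooth.h>
#include <bluetooth/rfcomm.h>

int main(void)
{
    struct sockaddr_rc addr = { 0 };
    int s = socket(AF_BLUETOOTH, SOCK_STREAM, BTPROTO_RFCOMM);
    if (s < 0) { perror("socket"); return 1; }

    addr.rc_family  = AF_BLUETOOTH;
    addr.rc_channel = 1;                            /* SPP commonly sits on channel 1 */
    str2ba("00:11:22:33:44:55", &addr.rc_bdaddr);   /* placeholder adapter address */

    if (connect(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("connect");
        return 1;
    }

    /* From here it behaves like any serial port: write a command, read a reply. */
    write(s, "ATZ\r", 4);
    char buf[64];
    ssize_t n = read(s, buf, sizeof(buf) - 1);
    if (n > 0) { buf[n] = '\0'; printf("reply: %s\n", buf); }

    close(s);
    return 0;
}
```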
Windows is a horrible example... it is not open at all. It's a fully closed-source OS. Sure, they don't make the hardware for their desktops, but neither does Apple (at this time). It's not that Macs don't have issues; it's just, why would people waste their time? I wrote an operating system too, and it had 0 viruses!!!! w00t, it's the best thing ever. Oh wait, no... nobody ever used it, maybe that's why (this isn't a joke, I actually did write a small operating system, and nobody ever used it because it was just meant for learning). Linux is open, and I agree that things tend to suck for years and not get done. I'm not some 'open source rules the world' guy, but I do like having open interfaces, like FreeSync/HDMI VRR, so the entire ecosystem can be compatible and I can buy a TV from TCL or Sharp that works with my PlayStation, Xbox, and PC. I'm not locked into a specific manufacturer and having to pay whatever they feel like charging because there aren't options.
Is Windows perfect? lol, nobody thinks that. Are there issues that arise from having to support millions (billions?) of combinations of hardware? Of course there will be. I'd rather deal with the occasional driver uninstall/reinstall once in a while and have the choice of which GPU I can put in my system than have a perfectly working system that I can't do what I want with or to. Yes, it will speed up "innovation" while, at the same time, reducing the number of companies that can use it, which in turn leads to less innovation over time. It doesn't seem like Apple is having any issues "innovating" with ARM even though they don't own it (hey, look what I did there, it was one of your examples!!!).
Oh well, we'll see if it passes regulatory scrutiny and if it does, we'll eventually find out how good or bad this is for consumers.
Linux is a good example of why open source doesn't always work great for everyone, but it sure is installed on more devices than Windows and macOS combined. Everyone feels like they have better ideas and goes off their own way, wasting lots of time and resources re-inventing the same wheel. Linux on the desktop won't take off until there is some consolidation, which is very difficult when nobody is really in charge. But it's not stopping anyone from inventing a better mousetrap either, which is how we end up with things like ARM servers, Docker containers, virtual machines, etc. Most of those "innovations" didn't start in a closed ecosystem. I would say a lot more innovation and ideas come out of small Linux/offshoot projects than from Windows or Mac (although I don't have any data, just a gut feeling).
 
People hate on Apple but I've never been disappointed by my iPhones over the years despite them being "closed" and the performance has been great.
I run a Pixel because software-wise, that's the next best thing to an iPhone without being Apple. At the same time, the urge to jump ship just for Apple's hardware, mostly battery life and cameras, is sitting in the back of my mind.

I liken Apple devices to appliances. As a PC enthusiast, I find that somewhat repulsive personally, but at the same time it's hard not to admire systems that just do what they're supposed to. I run all kinds of stuff on my desktops, laptops, desktops repurposed as servers... but my phone?

It had better be good at being a phone at the very least!
Sometimes having an open system isn't so wonderful (e.g. Windows and its myriad of viruses/trojans).
As closed as Microsoft has been from a culture standpoint over the years, they've really been on the receiving end when it comes to being targeted for having such an 'open' operating system.

They've spent quite some time making Windows more efficient and more secure, barring telemetry collection of course, to the point that it's also hard not to give them props for it.
I also view the NVIDIA takeover of ARM as a good thing because I think it will speed up innovation for ARM CPU/GPU in the notebook/desktop segment vs had it gone to anyone else. It may raise prices and some of the smaller guys on razor thin margins may get squeezed out but oh well.
If there's a point to be excited about, to me, it's this. Anti-competitive concerns aside, Nvidia's motivation for buying ARM has to involve pushing the architecture faster and further. When the Pi 4 came out, I looked into the technology behind it, and I am still somewhat amazed at how far ahead ARM is in terms of architecture design, and just how far behind they are in terms of getting that technology to market.

Nvidia doesn't have that problem, and if they can get in the same ballpark as Apple, well, I consider that a good thing. Give me a laptop that can run for days, or even a full workweek, on a single charge assuming you're not doing something intense?

The only thing I won't accept here is being limited to the Windows Store if said laptop is running Windows. Not that I don't find the Windows Store useful for what it does well (same for other 'stores' and glorified package managers for games and apps and otherwise), but I do want that choice preserved.

I guess they can always fall back to RISC-V if that is the case. I'm not a believer in community-sourced technology, as it always slows things down too much and sometimes never gains traction
When I look at RISC-V, I have to wonder two big things. First, who's going to optimize this thing for performance? Throw a stock ARM core at a problem, and you'll get piddly results, even compared to Qualcomm let alone Apple. Optimizing this stuff is a herculean undertaking that involves driving whole ecosystems.

Second, who's going to take a chance on actually making them? I can't even get an Atom-class ARM board readily, and not because the processors don't exist.

The only consolation here is that foundries seem to be expanding all over. Perhaps some of that capacity could be allocated to cheaper RISC-V parts, but it's still going to be an uphill effort.
just look at Linux on the desktop for example.
I've been running Linux desktops on and off for twenty years, and I'm still waiting for the year of the Linux desktop. Whereas Apple was able to make a modern, coherent desktop operating system with a BSD core, Linux still feels like Windows 98 in the early days of the internet. So much half-developed crap and disjointed implementations that any deviation from a solid baseline can break the system, and if you're not a near-expert on the particular flavor of the particular distro you're using, well, good luck.
 
I run a Pixel because software-wise, that's the next best thing to an iPhone without being Apple. At the same time, the urge to jump ship just for Apple's hardware, mostly battery life and cameras, is sitting in the back of my mind.

I liken Apple devices to appliances. As a PC enthusiast, I find that somewhat repulsive personally, but at the same time it's hard not to admire systems that just do what they're supposed to. I run all kinds of stuff on my desktops, laptops, desktops repurposed as servers... but my phone?

The thing about appliances is that everyone loves them, but "tech enthusiasts" of most stripes hate admitting it. They'll happily expect their washing machine to be just that, but heaven forbid someone wants to regard a small pocketable communication device in the same way. In reality, most people are not tech enthusiasts--just as most tech enthusiasts when confronted with a leaky kitchen sink wouldn't know what more to do beyond call the plumber. I'm happily in that category too :)

I love the current marketplace. People who want a smartphone that they regard as a general purpose computer have many options to choose from! And, the [mostly-silent] majority who prefer it to be in the form of an appliance have... well they have an option and generally tend to love it. Those like myself who enjoy a bit of one, a bit of the other, can purchase one of each or flip-flop between them.
 
A few speculations:

1. Graphics has a fixed ceiling. At the start of the 1990s computer audio was a bunch of simple beeps and tones. Then the premium sound card makers entered the market and started delivering a better audio experience. By the end of the 1990s these companies began to disappear. Why? Because sound was good enough - when the $10.00 sound chip on your motherboard can deliver an audio performance beyond what your ear can experience, premium external sound cards become redundant.

PC graphics are getting closer to rendering a true experience with each graphics core generation, and both of the major players in the PC / console market have their own graphics systems. After we reach a rendered level of detail beyond what the eye can perceive there is nothing left but increased resolution, and resolution is just horsepower, storage and throughput. PC graphics are pretty excellent now, I predict we will hit the visual limits of the 'human interface' in 20 years. nVidia doesn't control one of the computing platforms, so if the time comes when the majority of people say, "The built-in graphics are good enough for me," they could get caught without a major revenue stream.

2. nVidia is still a graphics card company. They'd like to be more. nVidia would like to play with big-money contracts, the kind of sales that companies like Oracle and IBM are doing. It was satisfying to nVidia to exploit their research and tech by charging high prices for their 10XX and 20XX GeForce cards, but that's in the consumer market, and no one is paying nVidia long-term maintenance contracts on their 2080 Ti. nVidia also felt the backlash from their consumer prices, which may be why they are dropping their prices for the 30XX series. There are limits to what you can charge in the consumer market.

3. It's possible that ARM owns or is licensing something unrelated to ARM that nVidia wants.

Both 1 & 2 mean that nVidia should be looking to diversify. Based on the way IT is growing, it appears that nVidia has chosen AI (primarily) and servers (secondarily) to grow into. ARM is a small, lightweight chip design that has good computational properties, can easily increase its core count, uses minimal power, and contains its own bridge and IO circuits. So ...

4. nVidia buys ARM to maintain its presence in the consumer market. It can do a lot of differentiating of the ARM portfolio while still offering the same level of ARM licensing seen now.

5. If nVidia is researching neural circuits, and those neural circuits need to be carried on a binary subsystem (many currently do), they could be looking to own and control a chip design that can be easily used as a dual threat: First for AI-style neural computations, and second for being able to step back and provide old-school binary computations. It's easy to imagine an ARM server-class chip that has 256 cores, with each core carrying a binary math and float core, a few dozen 'neural cores', and its own IO.

This is all speculation. But the need for nVidia to diversify is real - no one is going to pay big money for nVidia graphics if their eyes can't see the difference. And nVidia can't wait too long - 10 years is a short amount of time to enter and become a dominant player in a new field.
 
I predict we will hit the visual limits of the 'human interface' in 20 years
I wanted to scoff at this, but in terms of raw rendering power, you may not be wrong. There are plenty of other things that need to happen, though; the only part of computer-generated graphics that's really starting to hit diminishing returns is resolution.
 
Nvidia purchase of Arm is a 'disaster' for the UK tech industry: Government to probe $40 billion sale of British chip designer after co-founder says it is 'the equivalent to letting Trump get his hands on Trident'
  • The deal is still subject to regulatory approval including from the UK government
  • Hermann Hauser helped spin Arm off from former parent firm Acorn in 1990
  • He said the government should impose restrictions on the deal to protect jobs
  • Hauser said Arm is the last globally significant European technology company
  • The government says it will review the deal to check for any impact on the UK
  • Nvidia says it will keep the Arm headquarters in the UK and expand operations

The sale of British chip designer Arm to Nvidia for $40 billion (£31 billion) - described as a 'disaster' for the UK tech industry and 'the equivalent to letting Trump get his hands on Trident' by an Arm co-founder - is to be probed by the UK government.

Nvidia confirmed the deal to buy Arm on Monday, just four years after the firm was purchased by Japan's SoftBank Group for $32 billion (£24 billion) in 2016.

Arm is best known as the designer of processor chips used in most major smartphones - including both Apple and Samsung devices.

Hermann Hauser said the sale to the American chip maker would be a 'disaster for Cambridge, the UK and Europe' and see the 'last European technology company with any global relevance sold off to the Americans'.

Hauser, who helped spin Arm off from former parent company Acorn Computers in 1990, has urged the UK government to work to block the sale of the British firm.

However, Nvidia has pledged to keep Arm headquartered in Cambridge, while also promising to expand on Arm's work to build a 'world-class' technology centre.

Prime Minister Boris Johnson is said to be taking a personal interest in the deal for Arm which he described as playing a 'vital role' in the UK tech sector.
Roughly translated, the UK is pissed it won't be able to sell to China any more.

Hermann Hauser said the sale to the American chip maker would be a 'disaster for Cambridge, the UK and Europe' and see the 'last European technology company with any global relevance sold off to the Americans'.
So, a Japanese company owning it was ok (allowing sales to China), but a USA company owning it will be a disaster... give me a break.
This is all the more reason to get the technology away from the UK and China.

If they didn't want this to happen, then they shouldn't have sold it to a Japanese company.
Cry me a river UK - business is business.
 
Europe is already lacking in big tech. If there was no Brexit the EU would have certainly come up with some consortium to keep ARM. The world is going towards tech-nationalism and when you don't have any of the big boys, you don't have much leverage.

With ARM no longer independent, RISC-V will be the future.
 
A few speculations:

1. Graphics has a fixed ceiling. At the start of the 1990s computer audio was a bunch of simple beeps and tones. Then the premium sound card makers entered the market and started delivering a better audio experience. By the end of the 1990s these companies began to disappear. Why? Because sound was good enough - when the $10.00 sound chip on your motherboard can deliver an audio performance beyond what your ear can experience, premium external sound cards become redundant.

PC graphics are getting closer to rendering a true experience with each graphics core generation, and both of the major players in the PC / console market have their own graphics systems. After we reach a rendered level of detail beyond what the eye can perceive there is nothing left but increased resolution, and resolution is just horsepower, storage and throughput. PC graphics are pretty excellent now, I predict we will hit the visual limits of the 'human interface' in 20 years. nVidia doesn't control one of the computing platforms, so if the time comes when the majority of people say, "The built-in graphics are good enough for me," they could get caught without a major revenue stream.

2. nVidia is still a graphics card company. They'd like to be more. nVidia would like to play with big-money contracts, the kind of sales that companies like Oracle and IBM are doing. It was satisfying to nVidia to exploit their research and tech by charging high prices for their 10XX and 20XX GeForce cards, but that's in the consumer market, and no one is paying nVidia long-term maintenance contracts on their 2080 Ti. nVidia also felt the backlash from their consumer prices, which may be why they are dropping their prices for the 30XX series. There are limits to what you can charge in the consumer market.

3. It's possible that ARM owns or is licensing something unrelated to ARM that nVidia wants.

Both 1 & 2 mean that nVidia should be looking to diversify. Based on the way IT is growing, it appears that nVidia has chosen AI (primarily) and servers (secondarily) to grow into. ARM is a small, lightweight chip design that has good computational properties, can easily increase its core count, uses minimal power, and contains its own bridge and IO circuits. So ...

4. nVidia buys ARM to maintain its presence in the consumer market. It can do a lot of differentiating of the ARM portfolio while still offering the same level of ARM licensing seen now.

5. If nVidia is researching neural circuits, and those neural circuits need to be carried on a binary subsystem (many currently do), they could be looking to own and control a chip design that can be easily used as a dual threat: First for AI-style neural computations, and second for being able to step back and provide old-school binary computations. It's easy to imagine an ARM server-class chip that has 256 cores, with each core carrying a binary math and float core, a few dozen 'neural cores', and its own IO.

This is all speculation. But the need for nVidia to diversify is real - no one is going to pay big money for nVidia graphics if their eyes can't see the difference. And nVidia can't wait too long - 10 years is a short amount of time to enter and become a dominant player in a new field.

Great post & insights.

another speculation: nVidia's primary competition (AMD, Apple, Intel) all have general-purpose CPU and GPU initiatives and products, and are diversified to protect against possible long-term convergence. nVidia was not, until now. As CPUs become increasingly parallel while GPUs become increasingly general-purpose, I wonder if there is a convergence coming, at least at the level of a die, that nVidia should be at the front lines for.
 
With ARM no longer independent, RISC-V will be the future.
I have been reading this for 5+ years now and have yet to see anything serious come of it.
ARM is moving up, and RISC-V is, at best, used for specific tasks and is even embedded within various SoCs, rather than hosting OSes and software itself.

Not saying you are wrong, but I have seen next to zero evidence of that happening within the short to medium-term future.
 
Roughly translated, the UK is pissed it won't be able to sell to China any more.


So, a Japanese company owning it was ok (allowing sales to China), but a USA company owning it will be a disaster... give me a break.
This is all the more reason for it to get the technology away from the UK and China.

If they didn't want this to happen, then they shouldn't have sold it to a Japanese company.
Cry me a river UK - business is business.

This is the same UK government that signed a Brexit agreement, but is now violating what it signed?

https://www.bbc.com/news/uk-politics-54097320

I prefer Nvidia.

Europe is already lacking in big tech. If there was no Brexit the EU would have certainly come up with some consortium to keep ARM. The world is going towards tech-nationalism and when you don't have any of the big boys, you don't have much leverage.

With ARM no longer independent, RISC-V will be the future.

RISC-V is part of the EPI (the European Processor Initiative):

https://www.european-processor-initiative.eu/project/epi/
 
The thing about appliances is that everyone loves them, but "tech enthusiasts" of most stripes hate admitting it. They'll happily expect their washing machine to be just that, but heaven forbid someone wants to regard a small pocketable communication device in the same way. In reality, most people are not tech enthusiasts--just as most tech enthusiasts when confronted with a leaky kitchen sink wouldn't know what more to do beyond call the plumber. I'm happily in that category too :)

I love the current marketplace. People who want a smartphone that they regard as a general purpose computer have many options to choose from! And, the [mostly-silent] majority who prefer it to be in the form of an appliance have... well they have an option and generally tend to love it. Those like myself who enjoy a bit of one, a bit of the other, can purchase one of each or flip-flop between them.
Maybe this is why my opinion differs from others'. I route out custom circuit boards at my house and program MCUs for fun. If my washing machine has issues, I 3D print a new pump impeller. When I need a digital boost controller for my car, I build a circuit and program the MCU to do it. That's how I figured out Apple's non-existent Bluetooth serial support: I was writing a custom fan controller for someone and wanted to use Bluetooth for control, and lo and behold, he would have had to jailbreak and install a 3rd-party driver, so that was a no-go... it did work great on Android, laptops, and tablets with no issues though ;).

This is probably why I appreciate standards for interfacing with things and being a bit more open. I don't just have a phone and use it as a phone. It's an interface for me to do all kinds of things and connect to all kinds of projects. It's not as general-purpose as my PC (trust me, I've tried finding a good way to install Arch Linux on this thing, lol), but it's for sure not just a phone either (for me). Again, I'm sure I'm outside of the norm here, so take it with a large grain of salt, but maybe it'll help others see it from a new perspective (and I'm seeing others' perspectives as well; there is a lot to be said about something that actually just works, as it seems this is less and less the case!!).

I'm a hands-on person. Leaky sink? Yes, I crawl under my house and replace piping. The only thing I didn't do was replace my well pump, as it's very difficult to pull out of the ground without the right equipment and without breaking something. Same thing with my vehicles: I have no problems swapping an engine or transmission or installing aftermarket parts, etc. I do a little bit of everything, really, so when I find something like published requirements for roofing, it makes my life easier, as I know I'm doing it right, and the guy that put my house up isn't the only one who can tell me what shingles to use and what color, or insist he's the only one that can do the install, for 4x the price I'd normally pay.
Apple products do tend to work, although they aren't without fault of course. Sometimes that's what people want, and that's ok, but please don't wish/force that on the rest of us that are still holding out ;).
 
A few speculations:

1. Graphics has a fixed ceiling. At the start of the 1990s computer audio was a bunch of simple beeps and tones. Then the premium sound card makers entered the market and started delivering a better audio experience. By the end of the 1990s these companies began to disappear. Why? Because sound was good enough - when the $10.00 sound chip on your motherboard can deliver an audio performance beyond what your ear can experience, premium external sound cards become redundant.

PC graphics are getting closer to rendering a true experience with each graphics core generation, and both of the major players in the PC / console market have their own graphics systems. After we reach a rendered level of detail beyond what the eye can perceive there is nothing left but increased resolution, and resolution is just horsepower, storage and throughput. PC graphics are pretty excellent now, I predict we will hit the visual limits of the 'human interface' in 20 years. nVidia doesn't control one of the computing platforms, so if the time comes when the majority of people say, "The built-in graphics are good enough for me," they could get caught without a major revenue stream.

2. nVidia is still a graphics card company. They'd like to be more. nVidia would like to play with big-money contracts, the kind of sales that companies like Oracle and IBM are doing. It was satisfying to nVidia to exploit their research and tech by charging high prices for their 10XX and 20XX GeForce cards, but that's in the consumer market, and no one is paying nVidia long-term maintenance contracts on their 2080 Ti. nVidia also felt the backlash from their consumer prices, which may be why they are dropping their prices for the 30XX series. There are limits to what you can charge in the consumer market.

3. It's possible that ARM owns or is licensing something unrelated to ARM that nVidia wants.

Both 1 & 2 mean that nVidia should be looking to diversify. Based on the way IT is growing, it appears that nVidia has chosen AI (primarily) and servers (secondarily) to grow into. ARM is a small, lightweight chip design that has good computational properties, can easily increase its core count, uses minimal power, and contains its own bridge and IO circuits. So ...

4. nVidia buys ARM to maintain its presence in the consumer market. It can do a lot of differentiating of the ARM portfolio while still offering the same level of ARM licensing seen now.

5. If nVidia is researching neural circuits, and those neural circuits need to be carried on a binary subsystem (many currently do), they could be looking to own and control a chip design that can be easily used as a dual threat: First for AI-style neural computations, and second for being able to step back and provide old-school binary computations. It's easy to imagine an ARM server-class chip that has 256 cores, with each core carrying a binary math and float core, a few dozen 'neural cores', and its own IO.

This is all speculation. But the need for nVidia to diversify is real - no one is going to pay big money for nVidia graphics if their eyes can't see the difference. And nVidia can't wait too long - 10 years is a short amount of time to enter and become a dominant player in a new field.
While I appreciate the well-thought-out list, and I agree (more or less) with a good majority of it... what's to stop Nvidia from doing this with an ARM license (besides point 3, of course, but I don't know what that would be or if there is anything else)? I mean, they can (and do) diversify now. They can build an entire ecosystem from top to bottom with a cheap license. Look at Amazon and others doing just that. There is nothing stopping them from throwing some ARM chips together with their AI chips and selling a large contract to the government.

The play for ARM isn't about them expanding into these markets; they could try to do that now and compete against all the much bigger players. The play to buy ARM gives them control over the other players already in the game; they already have entry into the game, they just chose to go the roundabout way. I do agree nVidia needs to diversify, but nothing is or has been stopping them. They don't need to buy ARM to build any of these things. Again, maybe it's just me being pessimistic, but I don't see what any of this has to do with them needing to purchase ARM. Sure, it's all things they can and should be doing, but they don't need to own ARM to do it, just license it like they did with their Shield and the Nintendo Switch.
 
They don't need to buy ARM to build any of these things. Again, maybe it's just me being pessimistic, but I don't see what any of this has to do with them needing to purchase ARM. Sure, it's all things they can and should be doing, but they don't need to own ARM to do it, just license it like they did with their Shield and the Nintendo Switch.

Well the answer is no mystery since Jensen told the world. NVIDIA will leverage their GPU IP via ARM and it will be everywhere. Licensees who build phones will all have GeForce IP, Arm supercomputers will be designed with NVIDIA technology at its core, they’ll basically use Arm as a Trojan horse to seed the world with NVIDIA technology. You can’t put a price on that...or maybe you can—$12bn cash and the rest stocks, basically they got Arm for a bargain.
 
Well the answer is no mystery since Jensen told the world. NVIDIA will leverage their GPU IP via ARM and it will be everywhere. Licensees who build phones will all have GeForce IP, Arm supercomputers will be designed with NVIDIA technology at its core, they’ll basically use Arm as a Trojan horse to seed the world with NVIDIA technology. You can’t put a price on that...or maybe you can—$12bn cash and the rest stocks, basically they got Arm for a bargain.


I've been saying it for years but yeah: Nvidia has some seriously good technology.
A pity they aren't very FOSS friendly, for what that's worth. Probably not 40 billion.

But it's still a consideration.
 
Well the answer is no mystery since Jensen told the world. NVIDIA will leverage their GPU IP via ARM and it will be everywhere. Licensees who build phones will all have GeForce IP, Arm supercomputers will be designed with NVIDIA technology at its core, they’ll basically use Arm as a Trojan horse to seed the world with NVIDIA technology. You can’t put a price on that...or maybe you can—$12bn cash and the rest stocks, basically they got Arm for a bargain.
Yeah, that's the obvious thing (which I brought up in one of these threads, lol). It's the best way to break into the market and stop the competition. Like AMD licensing/building a GPU with Samsung for ARM; that will be a thing of the past. Why fight your way in when you can buy your way in? It's not dumb by any means, and a lot of businesses do this all the time. They can license their IP for a "small" up-charge with the ARM core and get their product into possibly billions of devices over the next decade. Of course, nothing was stopping them from doing this already, like AMD is doing with Samsung, but owning the market makes it much simpler than trying to compete ;). I'm not saying this is good or bad; it's just business and how it works.
 