Nvidia and MediaTek to collaborate on ARM gaming laptops

Things are starting to get interesting here. I really hope Nvidia pulls it off.

I think the issue is mostly old legacy software. Anything big and current will port to ARM no problem (talking like Photoshop, MS Office, Spotify, or whatever). Especially if Apple and Microsoft are pushing for it.

It will be sad that many games on Steam, and older titles that were never digital, will be abandoned. But we can still play DOS games today, so I'm sure there will be emulators that get better over time.
 
If that laptop doesn't run Steam, who is gonna honestly use it?
 
I'm not sure it's even going to be that. A $500 Chromebook with a one-year GeForce Now subscription.
I wouldn't mind that. I have to keep a Chromebook on hand for troubleshooting and testing things, but I never really use it because I detest the screen, keyboard, and trackpad. If they put out a reasonably priced one I could find a way to justify that purchase... for bragging rights, really.
 
They do make some nicer hardware if you pay up for it, but if I'm going to spend $600+ on a notebook, I think I'd rather have a Windows-based one for the flexibility.
 
I'm sure they do; there is also the very real possibility that I am a fussy little bitch and too damned picky when it comes to my keyboards. I've been spoiled by the Latitudes and Alienwares.
 
I have a few Chrome laptops. The Lenovo is a 2-in-1 with actually quite nice build quality. It was $280 and not bad for the price.

That said, I tried installing Steam under Linux. I did get Steam to install, but I could barely play any games. L4D was like 1fps at most, and even simple games like Celeste were unplayable.

But I was surprised I was even able to get that far. I wouldn't count them out. If Nvidia can make a Chromebook with entry-level gaming performance for like $300/$400 it would be a seller.
 
That said, I tried installing Steam under Linux. I did get Steam to install, but I could barely play any games. L4D was like 1fps at most, and even simple games like Celeste were unplayable.
How? L4D is native on Linux and you don't have to do anything but install the game. Unless you installed it on an ARM-based Chromebook or used an Nvidia GPU without installing their drivers?
But I was surprised I was even able to get that far. I wouldn't count them out. If Nvidia can make a Chromebook with entry-level gaming performance for like $300/$400 it would be a seller.
I'm actually installing Dead Space 2 on Linux through Wine and I call that a win. Though on a machine with an Intel GPU.
 
Ah, I'm sorry. I did the game test on my Samsung, which uses a Celeron chip. The Lenovo is MediaTek but I got it mixed up.

If I recall correctly, for some reason Linux couldn't see the GPU or something like that. I mean, there was video on the screen but it was all messed up.

I guess my point was that if Nvidia could make a ChromeOS-like device, but with a somewhat decent GPU, that would be huge for students or just people on the go who don't want a burn-your-balls-off "gaming laptop" monstrosity.
 
If that laptop doesn't run steam, who is gonna honestly use it?
Governments, corporations, developers, college students, schools - depending on the price, these units are going to sell, and hard.
Nearly everyone in this thread keeps bringing up the gaming aspect, and if the gaming aspect were the only aspect of these laptops, I would say they are totally correct, but that is no longer the case, and the scope of the current market and industry is far more vast than solely gaming.

Lower-cost hardware and games propelled x86 computing from the 1980s through the 2010s, and much of the development and advancement in nearly all PC-related hardware owes thanks to game development and gamers alike.
With the new functions of ARM on the horizon - low power usage, high performance per form factor, high battery life, AI, HPC, back-end development and tools, low costs, low heat output, tightly coupled GPU/SoC designs, and high-level software support - these will be the factors that propel ARM computing into the 2020s and beyond, not gaming.

In fact, if things continue the way they have been for the last year, PC gaming will be all but dead outside of the console market due to lack of GPU availability for the average individual.
Either that, or a massive paradigm shift is going to have to happen in hardware to either allow more powerful iGPUs to be present in APUs/SoCs, or PC gaming in general is going to have to shift to accommodate current iGPU technology levels in order to retain a relevant market share with even modest profits; either way, something has to give.

If things do start to shift towards ARM/NVIDIA, Valve will be all but obligated to work with them to get an ARM-based Steam client (regardless of the OS), and start working to convince game developers and software companies to use higher-level programming languages and APIs to accommodate ARM/NVIDIA as well as x86, or risk corporate suicide over the next 10-15 years.
Things are changing, and while AMD is keeping x86-64 competitive for the foreseeable future, Intel is coasting and doing little to improve their situation, or that of x86 in general, and are going to let everyone down - including the 40 years of gamers who virtually funded them to where they are now.
 
If you're wondering who will buy these... all you have to do is consider the M1. Apple is selling a ton of M1 machines right now. Nvidia is likely going to offer a Windows-running alternative with hardware they will spin as better than the M1... on a machine that may also be able to run Linux. (But at least it will probably support Linux on Windows for the student and engineer types.)

How will gaming work out on them? That is harder to say... I am sure that if Nvidia goes down this path they will have a plan. Apple built their software and hardware stack to help them emulate x86 code, and I think Nvidia will do the same... only, by talking to companies like MediaTek, I would have to assume it will lean more heavily on the software side, unless MediaTek isn't going to be producing the CPU (in which case, why do they need MediaTek, unless this won't be an SoC at all and MediaTek will be making ARM chipsets... which seems unlikely). However, who would be better at building a software layer that lets games run on ARM than Nvidia? They can leverage current developers via "The Way It's Meant to Be Played"... and I am pretty sure they have all the contacts they need to push the industry to provide profiles for older games. (Valve tried to get developers to provide profiles for Steam on Linux... but official profiles are pretty few.) Nvidia can easily apply the pressure required to ensure 90% of the game developers in the world provide profiles for their translation layer so things run smoothly. Also, the target systems will be standard... Linux profile support is a harder sell, as hardware and distro differences make one official (or even unofficial) profile hard to support.
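Just to make the "profiles" idea concrete, here is roughly the kind of per-title entry I imagine such a translation layer could ship and update over time. This is pure speculation on my part; every name and field below is made up for illustration and isn't any real Nvidia, Valve, or Microsoft interface.

Code:
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>

/* Hypothetical per-title profile for an x86-on-ARM translation layer.
 * All names and fields here are invented for illustration only. */
struct game_profile {
    const char *exe_name;        /* executable the profile applies to        */
    bool        pretranslate;    /* translate hot x86 code ahead of time     */
    bool        strict_x87;      /* bit-exact x87 floating point (slower)    */
    bool        soft_avx;        /* emulate AVX if the SoC has no equivalent */
    int         shader_cache_mb; /* size of a pre-seeded shader cache        */
};

/* The sort of table a vendor could push out alongside driver updates. */
static const struct game_profile profiles[] = {
    { "witcher3.exe", true,  false, true,  512 },
    { "celeste.exe",  false, true,  false,  64 },
};

int main(void)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        printf("%-14s pretranslate=%d strict_x87=%d soft_avx=%d cache=%dMB\n",
               profiles[i].exe_name, profiles[i].pretranslate,
               profiles[i].strict_x87, profiles[i].soft_avx,
               profiles[i].shader_cache_mb);
    return 0;
}

The hard part obviously isn't the data structure; it's having the publisher relationships to keep a table like that populated, which is exactly where Valve struggled on Linux.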

*** As an aside, I can see the marketing now. If Nvidia closes on ARM before launch... a new Nvidia-cyan-colored "The Way It's Meant to Be Played" logo. lol
 
I think MediaTek shows that this is nothing more than a well-built front end for GeForce Now. If it were a purpose-built machine for gaming, it would have a purpose-built SoC from Nvidia with their graphics built in, not a MediaTek SoC.
 
Ah, I'm sorry. I did the game test on my Samsung, which uses a Celeron chip. The Lenovo is MediaTek but I got it mixed up.

If I recall correctly, for some reason Linux couldn't see the GPU or something like that. I mean, there was video on the screen but it was all messed up.
That seems odd. I've installed Linux on my laptops with Intel GPUs and for the most part they work fine. As fine as an Intel GPU can work.
I guess my point was that if Nvidia could make a ChromeOS-like device, but with a somewhat decent GPU, that would be huge for students or just people on the go who don't want a burn-your-balls-off "gaming laptop" monstrosity.
Personally, I'd rather have a Ryzen laptop with an APU for graphics. It doesn't get hot and still does well enough for gaming. Most gaming laptops are over-the-top insane piles of shit, with enough battery life to last you one hour while delivering slower gaming performance, and enough copper heat pipes to make them heavy. Too bad you can't just take what the PS5 and the Xbox Series X have and put it in a laptop, because AMD wouldn't allow that.

Governments, corporations, developers, college students, schools - depending on the price, these units are going to sell, and hard.
Most governments and corporations will probably stick with x86 for compatibility with older software. Developers will have both ARM and x86 because they're developing for multiple platforms. College students can get away with using ARM, but they wouldn't be gaming on them. Schools are already using ARM Chromebooks because they're cheap and because of COVID-19. I wouldn't say these would sell, especially not hard.
Nearly everyone in this thread keeps bringing up the gaming aspect, and if the gaming aspect were the only aspect of these laptops, I would say they are totally correct, but that is no longer the case, and the scope of the current market and industry is far more vast than solely gaming.
We're talking about Nvidia collaborating to make gaming laptops. If you wanted an ARM laptop for business then you would go with Apple's M1 or Microsoft's Surface laptops. Most Surface laptops end up collecting dust, and Apple M1 users will be doing the same as well. Neither would I consider to be used primarily for gaming.
With the new functions of ARM on the horizon - low power usage, high performance per form factor, high battery life, AI, HPC, back-end development and tools, low costs, low heat output, tightly coupled GPU/SoC designs, and high-level software support - these will be the factors that propel ARM computing into the 2020s and beyond, not gaming.
If gaming wasn't important then people would stick with their Core2Duo laptops and be perfectly happy. All work and no play makes people buy expensive laptops to play their games. That's why Bob from accounting needs a laptop with a discrete GPU for... accounting reasons.
In fact, if things continue the way they have been for the last year, PC gaming will be all but dead outside of the console market due to lack of GPU availability for the average individual.
Either that, or a massive paradigm shift is going to have to happen in hardware to either allow more powerful iGPUs to be present in APUs/SoCs, or PC gaming in general is going to have to shift to accommodate current iGPU technology levels in order to retain a relevant market share with even modest profits; either way, something has to give.
More likely the industry will just accommodate the fact that everyone is still using GTX 1060s. In fact, depending on the presence of PS4s and Xbox Ones, the industry may continue to support that hardware as well. This will suck for those using RTX and RX 6000 GPUs, as well as the PS5 and Xbox Series, but that's how things work. We are in an economic collapse, so pushing for more expensive hardware with less return in performance is not going to end well for the industry. You're going to have an E.T. moment when someone makes a game that requires such expensive hardware that it throws the industry upside down. As a game developer you either make a game with low system requirements and fuck PS5 and RTX owners, or you make a game that demands hardware and fuck PS4 and GTX owners. The former will likely happen. At which point newer, cheaper hardware will likely flood the market, probably not from AMD or Nvidia. ARM will likely make this messy as performance and compatibility will be worse.
If things do start to shift towards ARM/NVIDIA, Valve will be all but obligated to work with them to get an ARM-based Steam client (regardless of the OS), and start working to convince game developers and software companies to use higher-level programming languages and APIs to accommodate ARM/NVIDIA as well as x86, or risk corporate suicide over the next 10-15 years.
Valve will likely support ARM because Valve is very accommodating. They already support Apple's M1 with the release of Metro Exodus. Valve doesn't care what CPU or OS you use, as long as you're using Steam. Valve won't be forcing developers to make ARM binaries, though, just like they aren't forcing developers to make Linux binaries.
Things are changing, and while AMD is keeping x86-64 competitive for the foreseeable future, Intel is coasting and doing little to improve their situation, or that of x86 in general, and are going to let everyone down - including the 40 years of gamers who virtually funded them to where they are now.
Reports of Intel's death have been greatly exaggerated. Intel won't be going anywhere anytime soon, and neither will x86. Yes, Intel fucked up, but so have AMD and Apple. Apple fucked up years ago by using PowerPC, and AMD fucked up with their Bulldozer-based CPUs. Apple recovered by going x86, and AMD recovered by creating Ryzen. Probably by next year Intel will have competitive products that use TSMC's or Samsung's manufacturing process, as well as a massive redesign of their x86 products that would put the Apple M1 to shame, as well as the Apple M2. I'd be worried about AMD, as their new APUs are still based on 7nm and still use Vega graphics instead of 5nm and RDNA-based graphics. Seems silly they aren't using their own technology to make their products competitive, but soon enough we'll get Intel's new GPUs as well.

As for Nvidia, let's just say that without a standard platform for other ARM competitors to compete on, you won't see any traction from consumers. Much like other Nvidia products, they don't play nice with others, and that'll hurt ARM's future.
 
I think MediaTek shows that this is nothing more than a well-built front end for GeForce Now. If it were a purpose-built machine for gaming, it would have a purpose-built SoC from Nvidia with their graphics built in, not a MediaTek SoC.

You're likely correct. I mean, I can see Nvidia building an M1 competitor... and I suspect they are working on something. But MediaTek? It does seem unlikely to be a high-performance CPU part... unless they are providing something other than an SoC, which seems unlikely.

This is probably not the leak we are looking for. lol
 
If you're wondering who will buy these... all you have to do is consider the M1. Apple is selling a ton of M1 machines right now.
I don't know about that. Right now Apple is slashing the prices of their M1 products; the discounts run anywhere from $50 up to $200. Seems like an odd thing to do when M1 sales are considered to be doing well, during a time period where anything made out of silicon is overpriced due to demand.
Nvidia is likely going to offer a Windows-running alternative with hardware they will spin as better than the M1... on a machine that may also be able to run Linux. (But at least it will probably support Linux on Windows for the student and engineer types.)
Most likely no Linux, as Nvidia isn't Linux friendly. If anything Windows or even Android.
Nvidia can easily apply the pressure required to ensure 90% of the game developers in the world provide profiles for their translation layer so things run smoothly.
Nvidia's power is an illusion. Nvidia's Shield products failed, as well as GeForce Now. What makes you think Nvidia's influence will have a greater effect here? Especially one that requires more resources from game developers?
 
Well, it doesn't matter if Nvidia is Linux friendly for Linux on Windows. I imagine MS will look to add that to Windows 10 on ARM should MS be powering another wave of ARM Windows machines. :) Will Nvidia make it easy to boot Linux on an Nvidia ARM machine? Yeah, probably not.

With their gaming stuff, I think Nvidia underestimated the power of the game publishers. They will not be cut out... and thanks to the pre-Steam days the big publishers own the majority of the big gaming franchises. I think Nvidia is going to give it another try... you may be right that it's not going to be easy. As crazy as it is, I think it all revolves around Apple. If their pro machines perform and start gaining creative market share, which I believe will be the case, the push for a Windows-powered alternative will grow. Yes, Microsoft will make their own chip, and Qualcomm may want to try again. But if anyone is going to compete with Apple it will be Nvidia... and that may get them in with some unexpected partners.

Should be an interesting few years anyway. Considering the chip shortages, I'm sure all we'll get for the next two years is leaks and blue-sky paper announcements at best.
 
Nvidia's Linux ARM drivers are in good shape (and open source, iirc). The only real issue is whether the rest of the system is supported and whether you are allowed to muck with it (install Linux or whatever).
 
I agree; I'm not sure I buy the "Nvidia hates Linux" stuff as much as some do. Yes, they don't support the people making open source drivers for their GPUs. However, they are the ONLY company that has a closed source Linux driver that is 100% on par with their Windows driver.

Yes, Intel and AMD both fully support the open source Mesa drivers and supply their own kernel drivers upstream to the Linux kernel.

Nvidia is, however, the only one that looks at Linux as a money maker. Hence they keep their driver closed. And really, in many ways their Linux driver is more advanced than their Windows gaming drivers. All the stuff they are selling for AI and datacenter isn't running Windows.

I know it's a super long shot... but it makes me wonder if Nvidia may not be toying with the idea of their own OS. Intel does have their own OS in Clear Linux. IBM has Red Hat. I don't think it's crazy to think Nvidia might someday build their own server and desktop distro. They are not married to FOSS, no... but I could see them rolling out a distro that requires an NV GPU, as it would only ship with a kernel pre-baked with their closed source driver. What better way to win benchmarks than to control the distro? Right now Intel's Clear is the fastest Linux distro... If Nvidia could roll out a distro with a datacenter version with stability first... and a desktop version with all the speed tweaks pre-enabled, as Intel does.

I mean, do we need an Nvidia version of ChromeOS? No. But I agree with you, Nvidia isn't anti-Linux... I think they simply see the potential it offers them to make money. Nvidia is an ambitious company... I know they have considered how best to replace Intel. But I wouldn't be surprised if they have also considered how they might best replace MS as well.
 
Well it doesn't matter if Nivida is Linux friendly for Linux on windows. I imagine MS will look to add that to Windows 10 ARM should MS be powering another wave of ARM windows machines. :) Will Nvidia make it easy to boot Linux on a Nvidia ARM machine ya probably not.
Which is why mentioning Linux and Nvidia together is usually accompanied by hateful statements.
Nvidia with their gaming stuff I think underestimated the power of the game publishers. They will not be cut out... and thanks to the pre steam days the big publishers own the majority of the big gaming franchises. I think Nvidia is going to give it another try... you may be right its not going to be easy. As crazy as it is I think it all revolves around Apple. If their pro machines perform and start gaining creative market share which I believe will be the case. The push for a Windows powered alternative will grow. Yes Microsoft will make their own chip, and Qcomm may want to try again. But if anyone is going to compete with Apple it will be Nvidia... and that may get them in with some unexpected partners.
If Nvidia, Qualcomm, and Microsoft try again, it's because Apple's M1 is a shining example that it can be done. Then again, Blizzard's World of Warcraft shows that MMOs can be successful, and yet it is the only really successful MMO out there. Whenever Apple does something, everyone else must copy them, even down to their faults.

I agree; I'm not sure I buy the "Nvidia hates Linux" stuff as much as some do. Yes, they don't support the people making open source drivers for their GPUs. However, they are the ONLY company that has a closed source Linux driver that is 100% on par with their Windows driver.
That's nothing to be proud of. Intel doesn't have closed source drivers for Linux, only open source. AMD has both, and as far as I know both are as good as or better than their Windows drivers. Most of AMD's closed source driver stack for Linux is built from open source drivers, with the exception of their Vulkan driver, and I feel that RADV, which is open source, is better anyway. The RadeonSI driver is so good that The Good Old Gamer made a statement about how AMD finally fixed their OpenGL performance for RDNA, without realizing that the open source OpenGL drivers on Linux are vastly superior for all AMD GPUs. I made him look infinitely stupid by pointing that out.

Nvidia's Linux binary drivers are good, but I wouldn't say that matters. When Cyberpunk 2077 was released, you couldn't play it reliably on Nvidia because their drivers didn't have a feature that is found in AMD's Mesa drivers. Not sure if Nvidia updated the drivers to fix this, but AMD was able to play Cyberpunk 2077 from the start. It was Valve who implemented the Vulkan feature that allowed AMD to play Cyberpunk, because Valve is involved in Vulkan development on both AMD and Intel. The open source drivers for Nvidia are so bad that nobody in their right mind would use them for anything besides installing the binary drivers.
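For what it's worth, "having the feature" here just means the driver advertises a particular Vulkan device extension, and the game (or the translation layer underneath it) picks a code path accordingly. Below is a minimal sketch in plain Vulkan C of how that check works. If I recall correctly the extension involved was VK_VALVE_mutable_descriptor_type, but treat the name as illustrative; the point is the mechanism, not the specific string.

Code:
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <vulkan/vulkan.h>

/* Returns true if the physical device advertises the named extension. */
static bool has_device_ext(VkPhysicalDevice gpu, const char *name)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, NULL, &count, NULL);

    VkExtensionProperties *props = calloc(count, sizeof *props);
    if (!props)
        return false;
    vkEnumerateDeviceExtensionProperties(gpu, NULL, &count, props);

    bool found = false;
    for (uint32_t i = 0; i < count; i++)
        if (strcmp(props[i].extensionName, name) == 0) {
            found = true;
            break;
        }
    free(props);
    return found;
}

int main(void)
{
    /* Minimal instance just to enumerate GPUs. */
    VkApplicationInfo app = { .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
                              .apiVersion = VK_API_VERSION_1_1 };
    VkInstanceCreateInfo info = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
                                  .pApplicationInfo = &app };
    VkInstance instance;
    if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS)
        return 1;

    uint32_t n = 0;
    vkEnumeratePhysicalDevices(instance, &n, NULL);
    VkPhysicalDevice gpus[8];
    if (n > 8) n = 8;
    vkEnumeratePhysicalDevices(instance, &n, gpus);

    for (uint32_t i = 0; i < n; i++) {
        VkPhysicalDeviceProperties p;
        vkGetPhysicalDeviceProperties(gpus[i], &p);
        printf("%s: %s\n", p.deviceName,
               has_device_ext(gpus[i], "VK_VALVE_mutable_descriptor_type")
                   ? "extension present" : "extension missing, use a fallback path");
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}

Builds with something like gcc check_ext.c -lvulkan. The check is cheap, which is why the sane behaviour for a game or a layer is a slower fallback path rather than a crash when a driver is missing something.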
Nvidia is, however, the only one that looks at Linux as a money maker. Hence they keep their driver closed. And really, in many ways their Linux driver is more advanced than their Windows gaming drivers. All the stuff they are selling for AI and datacenter isn't running Windows.
That's not why Nvidia doesn't work on open source drivers. You could easily modify the drivers to do things that Nvidia doesn't want you to do. You know, like take a mining card and use it for gaming? Or my favorite: use a regular Nvidia gaming card and, through a virtual machine, install Nvidia's Quadro drivers. AMD did the same thing until it wasn't cost-effective, as their open source drivers were advancing much faster than their binary drivers. AMDGPU wasn't meant for GCN 1.0 cards, but the community stepped in and made it work for those older cards. Same goes for Gallium Nine, as nobody supports that feature outside of Gallium-based drivers, meaning open source.
I know it's a super long shot... but it makes me wonder if Nvidia may not be toying with the idea of their own OS. Intel does have their own OS in Clear Linux. IBM has Red Hat. I don't think it's crazy to think Nvidia might someday build their own server and desktop distro.
Clear Linux is still Linux and still runs on AMD hardware, and so does Red Hat. Personally, fuck Red Hat and what they did with CentOS.

Right now Intel's Clear is the fastest Linux distro...
It's also terrible to use. It won't be long before other distros adopt what Clear Linux is doing and catch up in performance. I use the Xanmod kernel with Linux Mint and gained some performance.
I mean, do we need an Nvidia version of ChromeOS?
Honestly, fuck ChromeOS. The only reason it sells so well is because people just use it to browse the web, and that's all it's good for. I'd personally replace ChromeOS with any Linux distro, but that's just me. The people buying ChromeOS aren't tech savvy and don't understand what they're buying.
 
Maybe they could revive the MXM standard, which is slowly dying. Also, AMD + Qualcomm would be a good contender in this scenario.
 
Normally I would agree, but the laptop market is bigger, and the low power envelope of mobile plays to all of ARM's strengths and can take advantage of all of x86's weaknesses. It will result in a product that seems much better in comparison, and that would be far harder to do in the desktop space, where higher power envelopes give x86 a larger advantage. There is also another side of this: by doing this Nvidia is essentially developing a console, and by working with one of the major existing ARM providers they are showing that they aren't rocking the boat and don't intend to make big changes to the existing ARM model, which can help with their arguments about why owning ARM isn't anti-competitive. Starting with a mobile spec also greatly simplifies the design process; there are expectations for a desktop about future upgradability, part compatibility, and reliance on existing standards, and none of those really exist in the mobile space. If they decide that an 8V setup is most efficient, they can do that, whereas in a desktop they would have to stick to the existing ATX standards, PCIe specifications, and blah blah blah. In mobile they can design it from the ground up, as proprietary as they need to, to achieve maximum results.

ARM isn't low power. ARM is more efficient. So you can use that superior efficiency to provide (i) the same performance within a smaller power envelope, or (ii) higher performance for the same power envelope, or (iii) a combination of both.

There are 250W ARM SKUs, and there are no architectural limits to making higher-power SKUs (the limits are physical, i.e. the limits of cooling systems). ARM supports ATX standards, PCIe specifications, and all that. The reason Nvidia and MediaTek are focusing on laptops is that laptops are what most people want. Desktop is a niche market.
 
Easier said than done. The main thing here is support, as in: who's going to support a new platform with new problems? This is why Witcher 3 never came to Linux; CD Projekt Red got a bad reputation from poor Witcher 2 support. It's easy to compile a binary for Linux or ARM, but it doesn't end there. Also, chances are games made for Nvidia's ARM laptops will only work on Nvidia's ARM laptops. Other companies make ARM-based SoCs, and most developers aren't going to waste their time porting games to support a very narrow and limited market. Which means we're going back to emulation of x86, which will be slower than x86, guaranteed. The Apple M1 has proven that Apple users are willing to deal with x86 emulation that's 25% to 50% slower than native, not that x86 emulation is viable for gaming. Even at $600 I don't see people buying these like crazy.

Nvidia needs to do a lot of things to move forward with ARM gaming.
#1 Open source drivers in Linux.
#2 Convince Valve to make an ARM Steam client.
#3 Start selling motherboards with sockets for enthusiasts.

Also, isn't MediaTek the SoC maker that floods the market with cheap, crappy SoCs, usually found in Kodi boxes? There are so many Chinese ARM-based SoC manufacturers I can never tell.

Microsoft and Sony both wanted ARM for their consoles, but used x86 because ARM was only 32-bit then. Carmack also publicly liked the idea of ARM-based consoles. I believe developers are open to the nice idea of using a single modern architecture (ARM) for the entire gaming ecosystem.

Reports of Intel's death have been greatly exaggerated. Intel won't be going anywhere anytime soon, and neither will x86. Yes, Intel fucked up, but so have AMD and Apple. Apple fucked up years ago by using PowerPC, and AMD fucked up with their Bulldozer-based CPUs. Apple recovered by going x86, and AMD recovered by creating Ryzen. Probably by next year Intel will have competitive products that use TSMC's or Samsung's manufacturing process, as well as a massive redesign of their x86 products that would put the Apple M1 to shame, as well as the Apple M2. I'd be worried about AMD, as their new APUs are still based on 7nm and still use Vega graphics instead of 5nm and RDNA-based graphics. Seems silly they aren't using their own technology to make their products competitive, but soon enough we'll get Intel's new GPUs as well.

x86 has been consistently losing market share, to the point that it is no longer the mainstream architecture. ARM is the #1 architecture. Apple recovered by going x86 because ARM was not an option then. But Apple is now replacing x86 with ARM.

Good luck believing that future Intel products will put the Apple M1 and M2 to shame. :D
 
Oh boy, I can't wait for yet another ARM chipset that requires some proprietary firmware or other binary blobs to work.

In seriousness, I'd welcome this if it meant the chip was high-powered and FOSS/libre all the way through, but knowing Nvidia, which is addicted to always having a new proprietary widget, that isn't likely to happen.
 
Nvidia's power is an illusion. Nvidia's Shield products failed, as well as GeForce Now. What makes you think Nvidia's influence will have a greater effect here? Especially one that requires more resources from game developers?
The Shield TV & Shield TV Pro were updated in 2020. Definitely not failing when they sell quite fast & many people love them.
 
ARM isn't low power. ARM is more efficient. So you can use that superior efficiency to provide (i) the same performance within a smaller power envelope, or (ii) higher performance for the same power envelope, or (iii) a combination of both.

There are 250W ARM SKUs, and there are no architectural limits to making higher-power SKUs (the limits are physical, i.e. the limits of cooling systems). ARM supports ATX standards, PCIe specifications, and all that. The reason Nvidia and MediaTek are focusing on laptops is that laptops are what most people want. Desktop is a niche market.
It's kinda hard to tell how x86 vs ARM works out in efficiency since there's no apples-to-apples Apple vs AMD/Intel comparison. Apple's M1 uses 5nm while AMD uses 7nm and Intel is stuck on 14nm. That plays a huge role in power consumption. Then there's the GPU, because we are talking about laptops and nearly everyone has a GPU built into their CPU, and Apple's M1 GPU kinda sucks compared to AMD's. Superior compared to Intel, but Intel forgot about graphics until recently. I know someone is ready to point out some Geekbench score, but in reality the M1 sucks for gaming. Then there's the RAM, which on the Apple M1 is sitting right next to the SoC, which cuts down on power. Most x86 laptops have slots, which decrease performance while also increasing power consumption.

Microsoft and Sony both wanted ARM for their consoles, but used x86 because ARM was only 32-bit then. Carmack also publicly liked the idea of ARM-based consoles. I believe developers are open to the nice idea of using a single modern architecture (ARM) for the entire gaming ecosystem.
That was back before the PS4 and Xbox One were released. Also, who cares? These are consoles. Back in the 90's most consoles used MIPS, and in the 2000's most consoles used PowerPC, and now we're between x86 and ARM. Console manufacturers go with what's cheaper and available, and this time it was x86. ARM would have made sense, but since AMD had working 64-bit and going ARM this time would suck for backwards compatibility, it made sense to go with x86 again.
x86 has been consistently losing market share, to the point that it is no longer the mainstream architecture. ARM is the #1 architecture.
ARM is #1 for the same reason Linux is #1, in that most devices in the world use ARM. Your car stereo, your smartphone, your tablet, your WiFi router, and the list goes on. That doesn't really matter in the desktop market, where ARM and Linux have nearly no market share. There are many good reasons why ARM and Linux still have no market share there.
Apple recovered by going x86 because ARM was not an option then. But Apple is now replacing x86 with ARM.
I don't think people understand how Apple going x86 was such a good move. Not only was Intel kicking the crap out of IBM's PowerPC, but many Apple users were upset that Windows users got applications they couldn't use, or that, if they could, ran painfully slow. Going x86 fixed that by allowing Parallels and Boot Camp to give Apple users methods to run x86 Windows applications with decent performance.

Going ARM throws that out the window, not because ARM is superior, but because Apple has built such an empire with iOS that pushing the Mac to ARM makes a whole lot of sense. iOS dwarfs macOS by a lot, which is why the move to ARM makes sense. That's not good for consumers, but it's really good for Apple.
Good luck believing that future Intel products will put the Apple M1 and M2 to shame. :D
I'm seriously worried about Apple since they want to move towards ARM. They could do it, but don't expect this transition to happen quickly. Also, I expect Apple to lose market share over time. Windows x86 isn't going to move towards ARM anytime soon, which means for Apple ARM users that gaming is going to take a very hard back seat. Unless you like Mobile games because that's a thing Apple M1 users now have access to. Hurray?

Give it a couple of years and there are going to be new videos making fun of gaming on the Apple ARM MacBooks.


The Shield TV & Shield TV Pro were updated in 2020. Definitely not failing when they sell quite fast & many people love them.
They were updated in 2019, and the difference is the Nvidia Tegra X1+, which is still the same chip from 2015 but can now upscale to 4K with AI, Bluetooth support is now 5.0, and it has Dolby Vision. The new cylindrical model has 1 GB less RAM and only 8 GB of storage, which is worse than the 2015 models. For only $145 I could use it as a weapon to attack anyone who thinks this thing can play modern AAA games.

 
They have to start someplace. It's clear Apple and MS are both taking ARM seriously, and ARM will become more predominant in the future.

IMHO we are in a bit of a chicken-and-egg scenario, with both hardware and software in their infancy. Getting a standard in place and some decent hardware available seems like a good place to start. I'm sure it will be incredibly limited in the beginning, but the end game could be pretty cool. I would be interested in a sleek laptop that runs games better than a Switch, without Nintendo's limited ecosystem. Especially if it can run Windows 10 on ARM.
 
Personally, I'd rather have a Ryzen laptop with an APU for graphics. It doesn't get hot and still does well enough for gaming. Most gaming laptops are over-the-top insane piles of shit, with enough battery life to last you one hour while delivering slower gaming performance, and enough copper heat pipes to make them heavy. Too bad you can't just take what the PS5 and the Xbox Series X have and put it in a laptop, because AMD wouldn't allow that.
I believe that might be due to contracts in place with Sony and Microsoft, rather than AMD simply refusing to do so, since there is definitely a market for such an APU in a laptop.
Most governments and corporations will probably stick with x86 for compatibility with older software. Developers will have both ARM and x86 because they're developing for multiple platforms. College students can get away with using ARM, but they wouldn't be gaming on them. Schools are already using ARM Chromebooks because they're cheap and because of COVID-19.
That is starting to shift from what I have seen, as ARM is viable enough to be competitive with x86-64 with the Apple M1, though it would be nice to have more options.
We're talking about Nvidia collaborating to make gaming laptops. If you wanted an ARM laptop for business then you would go with Apple's M1 or Microsoft's Surface laptops. Most Surface laptops end up collecting dust, and Apple M1 users will be doing the same as well. Neither would I consider to be used primarily for gaming.
This is true, agreed.
If gaming wasn't important then people would stick with their Core2Duo laptops and be perfectly happy. All work and no play makes people buy expensive laptops to play their games. That's why Bob from accounting needs a laptop with a discrete GPU for... accounting reasons.
I get what you are saying, though a C2D would be far too weak for even basic workstation and computer usage in this era with modern software.
Even a Dell Wyse thin client has far more powerful CPUs and iGPUs than even the best C2D or C2Q.
More likely the industry will just accommodate the fact that everyone is still using GTX 1060s. In fact, depending on the presence of PS4s and Xbox Ones, the industry may continue to support that hardware as well. This will suck for those using RTX and RX 6000 GPUs, as well as the PS5 and Xbox Series, but that's how things work. We are in an economic collapse, so pushing for more expensive hardware with less return in performance is not going to end well for the industry. You're going to have an E.T. moment when someone makes a game that requires such expensive hardware that it throws the industry upside down. As a game developer you either make a game with low system requirements and fuck PS5 and RTX owners, or you make a game that demands hardware and fuck PS4 and GTX owners. The former will likely happen. At which point newer, cheaper hardware will likely flood the market, probably not from AMD or Nvidia. ARM will likely make this messy as performance and compatibility will be worse.
After what happened with Cyberpunk 2077, I agree with this statement.
Trying to make another Crysis without hardware scaling, and only demanding top-tier hardware, will be a death sentence for the development studio and publishers in this market.
Valve will likely support ARM because Valve is very accommodating. They already support Apple's M1 with the release of Metro Exodus. Valve doesn't care what CPU or OS you use, as long as you're using Steam. Valve won't be forcing developers to make ARM binaries, though, just like they aren't forcing developers to make Linux binaries.
Let's hope so!
Reports of Intel's death have been greatly exaggerated. Intel won't be going anywhere anytime soon, and neither will x86. Yes, Intel fucked up, but so have AMD and Apple. Apple fucked up years ago by using PowerPC, and AMD fucked up with their Bulldozer-based CPUs. Apple recovered by going x86, and AMD recovered by creating Ryzen. Probably by next year Intel will have competitive products that use TSMC's or Samsung's manufacturing process, as well as a massive redesign of their x86 products that would put the Apple M1 to shame, as well as the Apple M2. I'd be worried about AMD, as their new APUs are still based on 7nm and still use Vega graphics instead of 5nm and RDNA-based graphics. Seems silly they aren't using their own technology to make their products competitive, but soon enough we'll get Intel's new GPUs as well.
Intel isn't dead, and it is still very profitable due to vendor lock-in and OEM contracts - this isn't going to hold them forever, though, and everyone was saying they would be back in the game with 10nm this year, and it isn't happening - in fact, performance went backwards, which I did not think would happen.
Apple using PowerPC was actually a very competitive move back in the early 1990s when m68k was at a total dead end, and it just wasn't feasible for IBM to continue to invest money into a CPU ISA that was, by 2005, virtually only included in top-end workstations with limited sales, so the choice to move to Intel was profitable for both Apple and IBM at that time.

As for AMD with Bulldozer, yes, they messed up badly with that architecture by investing in a weak-cored and heavily threaded design whose benefits arrived years too early to be seen properly; we won't get into the TDP of said designs...
Intel is basically in the same spot with their current offerings again, and this is playing out much like Netburst, only with far more competitors and feasible options in the market in both software and hardware than was present back in the 2000s.

ARM is going to have a bright future, and it may inadvertently be because of Intel's bumbling and mismanagement, much to their chagrin.
 
I'm seriously worried about Apple since they want to move towards ARM. They could do it, but don't expect this transition to happen quickly. Also, I expect Apple to lose market share over time. Windows x86 isn't going to move towards ARM anytime soon, which means for Apple ARM users that gaming is going to take a very hard back seat. Unless you like Mobile games because that's a thing Apple M1 users now have access to. Hurray?

Give it a couple of years and there are going to be new videos making fun of gaming on the Apple ARM MacBooks.

Apple has said it expects to complete the transition within two years of the announcement, so... 2022. I'm sure x86 Macs will continue to get support for years after that, but the hardware lineup will move over quickly. Remember, Apple switched to Intel at a similar pace in 2006-2007, and it didn't have as much experience back then.

The company isn't likely to lose market share, either. Gartner estimates that Apple's market share climbed 12.6% year-over-year in the first quarter of 2021 — and remember, that's despite virtually every other PC maker seeing a pandemic-related surge in sales, and despite most of Apple's lineup using somewhat outdated Intel tech. I know you think gaming is the center of the universe, but it turns out people just want... good laptops. Imagine that.
 
It's kinda hard to tell how x86 vs ARM works out in efficiency since there's no apples-to-apples Apple vs AMD/Intel comparison. Apple's M1 uses 5nm while AMD uses 7nm and Intel is stuck on 14nm. That plays a huge role in power consumption. Then there's the GPU, because we are talking about laptops and nearly everyone has a GPU built into their CPU, and Apple's M1 GPU kinda sucks compared to AMD's. Superior compared to Intel, but Intel forgot about graphics until recently. I know someone is ready to point out some Geekbench score, but in reality the M1 sucks for gaming. Then there's the RAM, which on the Apple M1 is sitting right next to the SoC, which cuts down on power. Most x86 laptops have slots, which decrease performance while also increasing power consumption.
It would be better to have a more powerful GPU paired with an M1, rather than the integrated GPU currently offered.
From the videos I have watched, games on the M1 have felt less bound by the CPU and far more by the GPU, just from how consistently (or inconsistently) the games run.
That was back before the PS4 and Xbox One were released. Also, who cares? These are consoles. Back in the 90's most consoles used MIPS, and in the 2000's most consoles used PowerPC, and now we're between x86 and ARM. Console manufacturers go with what's cheaper and available, and this time it was x86. ARM would have made sense, but since AMD had working 64-bit and going ARM this time would suck for backwards compatibility, it made sense to go with x86 again.
ARM didn't release with 64-bit offerings until after the last console generation released, and that is an important point because it was a major factor in why both Sony and Microsoft went with x86-64 with a powerful iGPU - AMD was literally the only option at that point in time that was feasible.
Had ARM been in a state that it was in by the late 2010s or early 2020s, it would have made much more sense to do so than x86-64, but that wasn't the case, and thus the venerable AMD Jaguar-based APUs are still in use to this day.

Also, the only console in the 1990s that used MIPS was the original PlayStation, and only the GameCube, Wii, and PS3 used PowerPC in the 2000s.
We should totally derail this thread with a discussion about how great the PowerPC-based Cell architecture was in the PS3, especially since it is your favourite subject. :D
 
ARM isn't low power. ARM is more efficient. So you can use that superior efficiency to provide (i) the same performance within a smaller power envelope, or (ii) higher performance for the same power envelope, or (iii) a combination of both.

There are 250W ARM SKUs, and there are no architectural limits to making higher-power SKUs (the limits are physical, i.e. the limits of cooling systems). ARM supports ATX standards, PCIe specifications, and all that. The reason Nvidia and MediaTek are focusing on laptops is that laptops are what most people want. Desktop is a niche market.
ARM is certainly more efficient at lower power, but the higher the power envelope, the smaller the performance gap becomes. I mean, if you take the latest 10W ARM cores and put them against the latest 10W x86 cores from either Intel or AMD, the x86 parts get pummeled; bring both of those up to 250W and suddenly that performance gap closes and you find them basically trading back and forth. So yes, ARM is very scalable, and that is what makes it great, but the sub-15W power envelope is where it really stands out.
 
I believe that might be due to contracts in place with Sony and Microsoft, rather than AMD simply refusing to do so, since there is definitely a market for such an APU in a laptop.
Doesn't make me happier knowing that. It really pisses me off that, as consumers, the new Ryzen APUs we get are basically updated 4000-series parts using Zen 3. That shows that either AMD doesn't care about other segments of the market or the Sony and Microsoft money is really good.
That is starting to shift from what I have seen, as ARM is viable enough to be competitive with x86-64 with the Apple M1, though it would be nice to have more options.
Do governments even use Apple products?
I get what you are saying, though a C2D would be far too weak for even basic workstation and computer usage in this era with modern software.
Even a Dell Wyse thin client has far more powerful CPUs and iGPUs than even the best C2D or C2Q.
Like what? Unless you're in computer graphics of some sort, I don't see a reason to get anything more powerful.
Intel isn't dead, and it is still very profitable due to vendor lock-in and OEM contracts - this isn't going to hold them forever, though, and everyone was saying they would be back in the game with 10nm this year, and it isn't happening - in fact, performance went backwards, which I did not think would happen.
I kinda expect that from Intel as that's what a desperate person would do. It'll take time but sometime next year Intel will finally have their shit together, and hopefully use TSMC or Samsung for manufacturing because even if Intel does get 10nm working, it's still very far behind AMD's 7nm and Apple's 5nm. If Intel is still huffing paint and believes their 10nm is enough, then they deserve to die a financial death.
Apple using PowerPC was actually a very competitive move back in the early 1990s when m68k was at a total dead end, and it just wasn't feasible for IBM to continue to invest money into a CPU ISA that was, by 2005, virtually only included in top-end workstations with limited sales, so the choice to move to Intel was profitable for both Apple and IBM at that time.
I'm sure it helped that the Pentium 4 was a disaster. Much like today, nobody back then cared what AMD did because Intel is x86, even though Intel wasn't always on top of x86 performance.
ARM is going to have a bright future, and it may inadvertently be because of Intel's bumbling and mismanagement, much to their chagrin.
That's how I see Linux gaining any traction in the desktop market: we're basically waiting for Microsoft to fuck up. The difference here is that AMD needs to fuck up as well in order for ARM to see any future, and that future may be RISC-V instead. Of course, it could easily be that Intel and AMD don't fuck up and ARM is relegated to mobile devices and nothing more.

It would be better to have a more powerful GPU paired with an M1, rather than the integrated GPU currently offered.
From the videos I have watched, games on the M1 have felt less bound by the CPU and far more by the GPU, just from how consistently (or inconsistently) the games run.
That's what I'm seeing as well. I think CPU performance won't matter much in the future, but GPU performance will, and making a good GPU is going to be a lot harder for Apple to do.
Also, the only console in the 1990s that used MIPS was the original PlayStation, and only the GameCube, Wii, and PS3 used PowerPC in the 2000s.
We should totally derail this thread with a discussion about how great the PowerPC-based Cell architecture was in the PS3, especially since it is your favourite subject. :D
The N64 and PS2 were also MIPS.
Apple has said it expects to complete the transition within two years of the announcement, so... 2022. I'm sure x86 Macs will continue to get support for years after that, but the hardware lineup will move over quickly. Remember, Apple switched to Intel at a similar pace in 2006-2007, and it didn't have as much experience back then.
The switch to x86 was something most Apple users welcomed, while the switch to ARM won't be. Sure, Apple will only carry ARM-based devices, but I think sales will slump as a result. For me, using ARM outside of Android is a huge pain in the ass. Even still, if I do buy another Raspberry Pi-like device it won't be using ARM but x86 instead, just to avoid all the headaches I get with software. Even Android on ARM can be painful if the software isn't made by the manufacturer.

If Nvidia does make an ARM gaming device then I will be 100% dependent on them releasing updates and patching the software. I have no faith in manufacturers handling this sort of thing, as evidenced by the fact that my routers run either DD-WRT or OpenWrt and my mobile devices use a custom ROM like LineageOS or Corvus. If these names sound alien to you, then welcome to the world that is ARM and the lack of support these devices receive. I have a Raspberry Pi 3 and was ready to finally enjoy using Ubuntu on it, only to find out that 64-bit support was impossible due to the lack of RAM, and the GPU driver is still an unfinished mess.
The company isn't likely to lose market share, either. Gartner estimates that Apple's market share climbed 12.6% year-over-year in the first quarter of 2021 — and remember, that's despite virtually every other PC maker seeing a pandemic-related surge in sales, and despite most of Apple's lineup using somewhat outdated Intel tech.
Yeah, and like most people who bought Apple products, those people will soon find out they made a mistake. There are a lot of people who see great reviews of the new Apple M1s, but those reviews aren't telling the whole story. I'm gonna be the guy telling an Apple M1 user that the reason they can't play Cyberpunk 2077 is because Apple uses ARM. Then they'll tell me how Rosetta 2 is great and fixes x86, and I'll have to remind them that it's still emulation, and emulation is always slower.
I know you think gaming is the center of the universe, but it turns out people just want... good laptops. Imagine that.
Yea well lots of people think porn has nothing to do with the internet, and they just want the ability to check their email and pay their bills faster, but we all know that's bullshit. I'm sure someone at Apple believes the same as you do, and yet most iOS devices are just toys for children. There are more people using an iOS device to play games than Playstation and Xbox combined. Find me an iOS device that doesn't have a single Bejeweled clone or some other popular mobile game on it. Then... there's the Apple Pippin which was Apple's attempt at one point to make a game console.

Come back to me in 2 years and let's see what happens to Apple's M1 sales when people realize they can't play all the games they could on the Intel Macs. Apple would need to go the route of Epic and just buy exclusives for their devices, but that isn't going to happen.

 
That's what I'm seeing as well. I think CPU performance won't matter much in the future,
That's a fascinating theory and I bet people running payroll or ERPs or the like (just as a couple of OTTOMH examples) would disagree.
 
The switch to x86 was something most Apple users welcomed, while the switch to ARM won't be. Sure, Apple will only carry ARM-based devices, but I think sales will slump as a result. For me, using ARM outside of Android is a huge pain in the ass. Even still, if I do buy another Raspberry Pi-like device it won't be using ARM but x86 instead, just to avoid all the headaches I get with software. Even Android on ARM can be painful if the software isn't made by the manufacturer.

If Nvidia does make an ARM gaming device then I will be 100% dependent on them releasing updates and patching the software. I have no faith in manufacturers handling this sort of thing as evident that my routers either use DD-WRT or Open-WRT and my mobile devices use a custom rom like LineageOS or Corvus. If these names sound alien to you then welcome to the world that is ARM and the lack of support they receive. I have a Rapberry Pi 3 and was ready to finally enjoy using Ubuntu on it, only to find out that 64-bit support was impossible due to the lack of ram, and the GPU driver is still an unfinished mess.
But why will it slump? ARM for the Mac isn't the same as for Linux and Windows. The "headaches" aren't really there; you can generally assume software and peripherals will work, and Rosetta-translated apps run well enough as a rule. If they don't work, there's usually a good reason for it (eGPUs are the biggest concern right now).

Besides, given that sales of some of Apple's most popular systems are surging right now, saying it won't be welcome flies in the face of evidence.


Yea, and like most people that bought Apple products, those people will soon find out they made a mistake. There are a lot of people who see great reviews of the new Apple M1s, but those reviews aren't telling the whole story. I'm gonna be the guy telling an Apple M1 user that the reason they can't play Cyberpunk 2077 is because Apple uses ARM. Then they'll tell me how Rosetta 2 is great and takes care of x86, and I'll have to remind them that it's still emulation, and emulation is always slower.
Please drop the childish fan wars. Macs are fine.

How many Mac users do you know are playing Cyberpunk 2077 right now, or desperately wish they could? Not many — you'd need a pretty beefy iMac or Mac Pro just to run it at a decent clip. That does say something about Apple's lack of gaming-friendly hardware, but it also means that gaming matters approximately jack squat in terms of adoption. For that matter, why the hell would someone considering an ultraportable laptop of any kind, not just the MacBook Air or Pro, worry about whether or not they could play the latest AAA title? They're far more concerned that they'll have the battery life to last all day, that the keyboard feels good, or that it's responsive enough to juggle all their work apps.


Yea well, lots of people think porn has nothing to do with the internet and that they just want the ability to check their email and pay their bills faster, but we all know that's bullshit. I'm sure someone at Apple believes the same as you do, and yet most iOS devices are just toys for children. There are more people using an iOS device to play games than PlayStation and Xbox combined. Find me an iOS device that doesn't have a single Bejeweled clone or some other popular mobile game on it. Then there's the Apple Pippin, which was Apple's attempt at one point to make a game console.

Come back to me in 2 years and let's see what happens to Apple's M1 sales when people realize they can't play all the games they could on the Intel Macs. Apple would need to go the route of Epic and just buy exclusives for their devices, but that isn't going to happen.
Most iOS devices are phones and tablets that get used as phones and tablets. Having a game or two on your phone doesn't suddenly mean that gaming was an important factor in the buying decision. And an iPhone isn't a Mac, just as your Android phone isn't the same as your PC. There are different uses and expectations, especially for ultraportables where the performance isn't really there no matter whose system you buy.

I'll be happy to come back to it in two years, because the odds are that Apple's sales will have gone up (at least, relative to the post-pandemic slump all vendors are likely to face) thanks to renewed interest in its systems. I'm not so naive as to think Apple will suddenly grab a huge slice it didn't have before, but gaming is unlikely to hold Mac sales back in a significant way.
 
Doesn't make me happier knowing that. It really pisses me off that, for us as consumers, the new Ryzen APUs are basically the 4000 series updated to Zen 3. That shows that either AMD doesn't care about other segments of the market or the Sony/Microsoft money is really good.
Let's hope it is the latter, but I won't hold my breath for any megacorp.
Do governments even use Apple products?
Not all of them, but many do, and when they do, nearly all workstations and mobile devices are Apple, at least from what I have seen.
Like what? Unless you're in computer graphics of some sort, I don't see a reason to get anything more powerful.
Even average usage can be a struggle on a C2D with Windows 10 and modern applications - Microsoft Teams would be a nightmare to run on such a CPU, even with an SSD and plenty of RAM.
It isn't that the Core 2 CPUs are bad; they're just heavily aged and no longer powerful enough to run modern/enterprise applications properly, even on an average day of usage - they can do it, just not very well compared to even low-end or embedded offerings.
I kinda expect that from Intel, as that's what a desperate company would do. It'll take time, but sometime next year Intel will finally have their shit together, and hopefully they'll use TSMC or Samsung for manufacturing, because even if Intel does get 10nm working, it's still far behind AMD's 7nm and Apple's 5nm. If Intel is still huffing paint and believes their 10nm is enough, then they deserve to die a financial death.
I sure hope they get their act together, and soon, or everyone is going to have to pay up to ARM/NVIDIA (without any heavy competition).
I'm sure it helped that the Pentium 4 was a disaster. Much like today, nobody back then cared what AMD did because Intel is x86, even though Intel wasn't always on top of x86 performance.
It did, but that didn't stop a zillion Pentium 4 workstations from being sold and put into production - much to my chagrin. :p
That's how I see Linux gaining any traction in the desktop market: we're basically waiting for Microsoft to fuck up. The difference here is that AMD needs to fuck up as well in order for ARM to see any future, and that future may be RISC-V instead. Of course, it could easily be that Intel and AMD don't fuck up and ARM is relegated to mobile devices and nothing more.
Even if AMD doesn't mess up, they are still providing plenty of competition with x86-64, and without them ARM would have become the dominant CPU ISA much sooner.
I don't think ARM is going to stay exclusively in the mobile space forever, as they (excluding Apple's ARM) are gaining a lot of traction in the proprietary and non-proprietary server market, supercomputers, and niche workstations.
That's what I'm seeing as well. I think CPU performance won't matter much in the future, but GPU performance will, and making a good GPU is going to be a lot harder for Apple to do.
That could be, and even a modern 8-core x86-64 desktop CPU with a decent clock can drive any game engine properly, as long as there is a powerful enough GPU paired with it.
N64, Dreamcast, and PS2 were all MIPS.
You're right, I forgot about N64 with its 64-bit MIPS CPU, and the PS2 was released in 2000, so it technically wasn't in production in the 1990s. ;)
The Dreamcast used a SuperH SH-4 CPU, though, which used the SuperH ISA from Hitachi - there is a really great article on how that CPU and its ISA came to be, going back to the 32X and Saturn.
The switch to x86 was something most Apple users welcomed, while the switch to ARM won't be. Sure, Apple will only carry ARM-based devices, but I think sales will slump as a result. Using ARM for anything outside of Android is a huge pain in the ass for me. Even so, if I do buy another Raspberry Pi-like device it won't be ARM but x86, just to avoid all the headaches I get with software. Even Android on ARM can be painful if the software isn't made by the manufacturer.
Software maturity on ARM is a big part of this, and that is exactly how I felt when I tried to migrate my servers entirely to ARM back in the mid-2010s when it was still 32-bit as well.
It was a great emerging architecture with a lot of features for the time, but the software and feature-sets just were not there yet.
If Nvidia does make an ARM gaming device then I will be 100% dependent on them releasing updates and patching the software. I have no faith in manufacturers handling this sort of thing, as evidenced by the fact that my routers run DD-WRT or OpenWrt and my mobile devices run a custom ROM like LineageOS or Corvus. If those names sound alien to you, then welcome to the world of ARM and the lack of support these devices receive. I have a Raspberry Pi 3 and was ready to finally enjoy using Ubuntu on it, only to find out that 64-bit support was impractical due to the lack of RAM, and the GPU driver is still an unfinished mess.
Yep, I have heard of those, especially for MIPS.
Oh, I know what you mean with the RP3 and any 64-bit OS; it can technically do it, but the lack of RAM made it pretty painful, though at least that was resolved with the RP4 4GB and 8GB models, hopefully with improved GPU drivers.

There are far more powerful ARM platforms out there than the RP3, but again, software support is going to be a bigger hurdle than the hardware itself already is - at least as of this post.
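If anyone wants to sanity-check a board before committing to a 64-bit userland, here's a minimal Linux-only Python sketch along those lines (the 2 GiB cutoff is just my own arbitrary comfort line, not any official requirement). Note that it reports what the running kernel is, not what the silicon could do:

import platform

def total_ram_mib():
    # MemTotal in /proc/meminfo is reported in KiB
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemTotal:"):
                return int(line.split()[1]) // 1024
    return None

if __name__ == "__main__":
    arch = platform.machine()   # e.g. "aarch64", "armv7l", "x86_64"
    ram = total_ram_mib()
    print("arch = {}, RAM = {} MiB".format(arch, ram))
    # 2 GiB is an arbitrary comfort threshold for a 64-bit desktop, not a hard rule
    if arch == "aarch64" and ram is not None and ram < 2048:
        print("64-bit kernel, but this little RAM will make a 64-bit desktop painful")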
Yea, and like most people that bought Apple products, those people will soon find out they made a mistake. There are a lot of people who see great reviews of the new Apple M1s, but those reviews aren't telling the whole story. I'm gonna be the guy telling an Apple M1 user that the reason they can't play Cyberpunk 2077 is because Apple uses ARM. Then they'll tell me how Rosetta 2 is great and takes care of x86, and I'll have to remind them that it's still emulation, and emulation is always slower.
Well, Rosetta 2 is technically a binary translation layer rather than a full emulator, if I remember correctly.
Translation is much faster and more efficient, and requires far less overhead, than straight emulation of every instruction, like what Microsoft was doing with x86 apps on Windows on ARM.

While it isn't 1:1, it is very close in many use-case scenarios.
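To make the difference concrete, here's a toy Python sketch. It has nothing to do with Rosetta 2's actual internals, and the two-instruction "guest" ISA in it is completely made up: the emulator decodes and dispatches every instruction on every run, while the translator pays the decode cost once up front and emits an equivalent host function you can call over and over.

import timeit

# pretend "guest" program in a made-up two-instruction ISA
PROGRAM = [("add", 3), ("mul", 2), ("add", 1)] * 50

def emulate(program, reg=0):
    # emulation-style: decode and dispatch every instruction on every run
    for op, arg in program:
        if op == "add":
            reg += arg
        elif op == "mul":
            reg *= arg
    return reg

def translate(program):
    # translation-style: decode once, emit equivalent straight-line host code
    lines = ["def _translated(reg=0):"]
    for op, arg in program:
        lines.append("    reg {} {}".format("+=" if op == "add" else "*=", arg))
    lines.append("    return reg")
    namespace = {}
    exec("\n".join(lines), namespace)
    return namespace["_translated"]

if __name__ == "__main__":
    translated = translate(PROGRAM)          # one-time translation cost
    assert emulate(PROGRAM) == translated()  # same result either way
    print("emulated:  ", timeit.timeit(lambda: emulate(PROGRAM), number=10000))
    print("translated:", timeit.timeit(lambda: translated(), number=10000))

The real thing is obviously far more sophisticated (ahead-of-time translation with caching, plus a JIT for code generated at run time), but the basic reason translated code can get close to native speed is the same: the per-instruction decode/dispatch overhead gets paid once instead of every time the code runs.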
Also, I don't think many individuals would purchase an Apple computer to game on, as almost all of their offerings are geared towards content creators and more niche production markets, which is exactly what the M1 excels at.

For gaming, though, not so much - agreed.
 
That's a fascinating theory and I bet people running payroll or ERPs or the like (just as a couple of OTTOMH examples) would disagree.
For gaming by itself, DukenukemX is correct.
For other programs, functions, databases, rendering, compiling, etc. - we always need more CPU processing power!
 