Apple opens the App Store to retro game emulators

polonyc2

Apple is loosening its App Store restrictions and opening the marketplace up to retro game emulators...in an update on Friday, Apple announced that game emulators can come to the App Store globally and offer downloadable games...Apple says those games must comply with "all applicable laws" though — an indication it will ban apps that provide pirated titles...

https://www.theverge.com/2024/4/5/24122341/apple-app-store-game-emulators-super-apps
 
an indication it will ban apps that provide pirated titles...

Provide, or allow one to run?

It's funny to me how they claim to need to control this, when if anything else goes wrong with a 3rd party app in the app store they are all like "well that's not our problem, look at the app developer".

Walled gardens must die.

Compatibility aside, the only person who should be able to decide what does and does not run on any given device should be the person who owns that device. And I know they want us to not own anything and just pay them rental fees, but the owner today is still the person who bought the damn phone, and that's not about to change any time soon.

It was because of walled gardens I abandoned Apple in 2012.

I got the first iPhone in 2007 shortly after launch. I considered it unnecessary and wasn't going to get one, but then my ex bought one, and I was all like "well, if she is going to own one, I am going to be a bit jealous, so I am going to have to buy one as well."

I remember thinking $600 was a crazy and irresponsible amount of money to spend on a phone. How times have changed :D

Then the next year, with an AT&T contract re-up, I got a "too good to pass up" deal to upgrade to an iPhone 3G.

Two years later I re-upped again and got the iPhone 4.

But at this time I was really starting to sour on Apple.

At the time, you needed to set up and sync the iPhone using iTunes on a computer. There was none of the modern OTA stuff.

I had pretty much entirely switched over to Linux except for games, and there was no Linux iTunes client. It also really did not like running under Wine.

I hated how I couldn't just drop mp3 files onto it like a storage device and have it play them. It had to over-complicate things by syncing with iTunes and building unnecessary database shit instead of just reading my mp3 files from the file structure, reading the ID3 tags, and playing them.
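(For what it's worth, the "just read the files and their tags" approach really is about this simple. Here's a rough, stdlib-only Python sketch, purely illustrative and not anything iTunes actually did: it walks a music folder and pulls the basic ID3v1 tag off the end of each mp3. ID3v2 is more involved, and the mount point below is made up.)

import os

def read_id3v1(path):
    """Return the basic ID3v1 tag (last 128 bytes of the file), or None if absent."""
    if os.path.getsize(path) < 128:
        return None
    with open(path, "rb") as f:
        f.seek(-128, os.SEEK_END)
        block = f.read(128)
    if block[:3] != b"TAG":
        return None  # no ID3v1 tag; newer files keep an ID3v2 tag at the start instead

    def field(raw):
        # ID3v1 fields are fixed-width, NUL-padded, usually latin-1
        return raw.split(b"\x00", 1)[0].decode("latin-1", "replace").strip()

    return {
        "title":  field(block[3:33]),
        "artist": field(block[33:63]),
        "album":  field(block[63:93]),
        "year":   field(block[93:97]),
    }

def scan_music_folder(root):
    """Walk a plain folder tree and yield (path, tags) for every .mp3 found."""
    for dirpath, _dirs, files in os.walk(root):
        for name in sorted(files):
            if name.lower().endswith(".mp3"):
                path = os.path.join(dirpath, name)
                yield path, read_id3v1(path)

if __name__ == "__main__":
    # "/media/phone/Music" is a hypothetical mass-storage mount point
    for path, tags in scan_music_folder("/media/phone/Music"):
        print(path, tags)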

And then there was the walled garden.

I kept jailbreaking, and then there was always a new update, and then a jailbreak wouldn't work for months, until a new one came around, and then the next update broke it.

It got to the point where more often than not I couldn't use my phone for the things I wanted to use it for, so it was time for a change.

I ditched both Apple and AT&T, and got a Samsung Galaxy SIII on Verizon. It was just OK, until I was able to get rid of all the Samsung junk software on it and flash it with CyanogenMod, at which point it became my favorite phone to date.

After that I had an LG G2, great phone, with some strange Korean-inspired aesthetics in the UI.

After that I had a Motorola Droid Turbo. It was awesome, but it is the only phone to date I've experienced screen burn-in on. Early OLED issues, I guess.

Verizon eventually also turned out to be a pain in the ass though. The Galaxy S3 turned out to be an exception. Most other phones couldn't even be rooted. Locked bootloaders, etc. etc. you name it. Verizon didn't want me to be able to use my phone the way I wanted to. So eventually I ditched them as well.

I briefly used the Intel x86 phone, the Asus Zenfone 2. I actually really liked it, but then I moved on and bought a Nexus 5x and signed up for Google Project Fi, where I have been ever since.

The Nexus 5x gave way to a first gen Pixel, which gave way to a Pixel 3, and then a Pixel 5a.

I liked the Nexus line and the original Pixel, but as time goes on they keep doing things that piss me off. Getting rid of analog audio ports, integrating search into the home screen and menus (instead of just keeping it in the browser like on my PC), adding assistants, more and more unwanted cloud sync, and now AI bullshit.

Most of this started phasing in more and more during my 5a ownership, and as each new thing came along I did my best to block and disable it, including going through every single setup option and disabling anything and everything related to backup, sync, or cloud, as well as anything AI, anything face recognition, anything voice control, etc.

My Pixel 5a bricked itself last month, and I dreaded its replacement. I didn't want even more AI/Assistant/Cloud crap, and I also hated that they removed the audio port and my favorite feature, the rear fingerprint scanner. I had neglected to cancel my $6 per month phone protection, so I just decided to see what they would send me as a replacement.

I wound up with a 7 Pro. I don't like it, but once I essentially turn it into a Nexus phone by stripping out everything that makes it a Pixel, disabling and replacing apps, the launcher, and the AI-related settings, it is "ok". It's a bit too big for one-handed typing (and the one-handed mode keyboard is awkward at best), and over a month later I still really miss the rear fingerprint scanner (the under-glass one is just not as useful, as you can't use it by feel without looking at it, and it is not as reliable). Without a case it is also top-heavy and slippery as hell, resulting in butterfinger phone drops. I don't like it, but I also can't think of any other phone on the market I wouldn't also dislike, so I am just keeping it. :/

I'll probably be in the same situation when this phone bricks itself and I get my next replacement. Phone development has just been going backwards for a decade now. I like nothing that is new on the market, and wish for a time of swappable batteries, mini-SD storage slots, and audio ports, without the bloat, AI junk, always-listening nonsense, spyware, data collection, etc.

Back 15 years ago I used to say "I want my phone to be more like my computer, NOT my computer to be more like my phone" but now even that seems lost. Windows 10 and 11 are more like phones than ever, and the phones are worse.

Tech after ~2007-2010 some time has just been a slow motion slippery slope into crap.


Holy shit did I digress. Sorry about that.
 
Now they just need to allow apps related to torrents/usenet (and/or open sideloading up outside of the EU)
 
no shit!? i aint readin all that.

yeah i can see nintendo pouncing on this pretty quick, guess we'll see...
Nintendo will likely pounce on the emulators themselves, but not on Apple. And Apple will likely be quick to pull down apps that draw too much heat from the big 3.
 
perhaps it's finally time to cut my exceedingly high-quality hobby Game Boy emulator loose on Apple platforms

they won't know what hit em

 
I wish they'd build a console powered by M3 and let Arcade compete with Steam already! 🙃
 
I wish they'd build a console powered by M3 and let Arcade compete with Steam already! 🙃
It would be interesting to see what something like that could do.

It would either need to emulate x86, or persuade developers to port to yet another platform, but both of those are doable.

Apple's GPU performance has repeatedly fallen short, though. From what I can quickly find, the M3 Max is ~25% faster than the M2 Max.

In the few cross-platform graphics tests that exist (ugh, Geekbench, I hate Geekbench), the M2 Max performed at about a third of the performance of a 4090. That would put an M3 Max at ~40% of a 4090.

The M3 Pro performs at about 38% of the M3 Max, which means it would perform ~16% as fast as a 4090.

The regular M3 performs at about 25% of an M3 Max, which places it at about 10% of a 4090.
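Chaining those ratios together, as a quick sanity check (rough Python sketch; the Geekbench-derived inputs are assumptions, not measurements):

# Back-of-the-napkin math behind the percentages above.
# Assumption from cross-platform Geekbench-style results: M2 Max ~ 1/3 of an RTX 4090.
RTX_4090 = 1.00

m2_max = RTX_4090 / 3        # ~0.33
m3_max = m2_max * 1.25       # M3 Max ~25% faster than M2 Max -> ~0.42
m3_pro = m3_max * 0.38       # M3 Pro ~38% of M3 Max          -> ~0.16
m3     = m3_max * 0.25       # M3 ~25% of M3 Max              -> ~0.10

for name, score in [("M3 Max", m3_max), ("M3 Pro", m3_pro), ("M3", m3)]:
    print(f"{name}: roughly {score:.0%} of an RTX 4090")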


So, if we compare these performance levels with GPUs out there:

M3 Max ~ RTX 4060 Ti
M3 Pro ~ Somewhere between a GTX 1060 and a GTX 1070 (closer to the 1060)
M3 ~ Somewhere between a GTX 1050 Ti and a GTX 1060 (closer to the 1050 Ti)

For comparison I just gave away a 1070 based system for free to the kid of a friend, because it was collecting dust, wasn't worth the effort of selling, and the kid could really use it.

Geekbench isn't exactly a great benchmark, and these are indirect comparisons, so take it with a pinch of salt, but it's what I could find, and at least it paints some kind of picture of the raw power of the GPU component in Apple's M-series chips.

I would guess that, based on how Apple prices systems with the M3 Max, it would be WAY too expensive to be competitive in the console market, and the M3 Pro and M3 would not be able to keep up on the GPU side with the latest gen (and they'd cost more).
 
It would be interesting to see what something like that could do. [...]
I'm thinking ports running natively and a console. Around the price of a PS5 maybe less. They don't need crazy hardware profits, the store would take care of that.
I don't game but in video editing, the M3 Max with 128GB has been a game changer. My Threadripper rig hardly gets turned on now and my power bill shows it!
 
I'm thinking ports running natively and a console. Around the price of a PS5 maybe less. They don't need crazy hardware profits, the store would take care of that.
I don't game but in video editing, the M3 Max with 128GB has been a game changer. My Threadripper rig hardly gets turned on now and my power bill shows it!
These are not the same use cases. Apple's M-series chips have built-in hardware acceleration for ProRes and other codecs, as well as for other video-editing-related graphics work. (Crazy what you can do when you're vertically integrated, huh? Own the software design, the operating system design, and the hardware design, and you can set priorities for the markets you want to corner.)

Back in the day you used to be able to buy add-on boards for that kind of stuff for PCs as well, but for whatever reason it is less common these days, which has given Apple a niche when it comes to video editing on their M-series chips.

These things tend to work pretty well when you focus on hardware accelerating a few things, but it is not a strategy that can be expanded ad infinitum. At some point, if you add special purpose portions of the chip to deal with hardware acceleration of everything under the sun, it becomes less efficient than just doing it in software.

Your experience with video editing, however, will tell you absolutely nothing about gaming. These are different tasks, and most computers have hardware-accelerated gaming to one extent or another. It's called a GPU.

Apple's GPUs are fine for the kind of "creative" workstation use they market them for, but in general a little underwhelming for most games. As demonstrated above, the M3 Max in all its glory performs like a mid- to low-end current-gen discrete GPU on a PC for 3D rendering purposes. The Pro and regular M3 are far, far behind that.

In this case it's a matter of the right tool for the right job. Thanks to Apple's approach, if you do video editing in 2021-2024 (and probably for the foreseeable future), an Apple M-series chip is probably the right choice (albeit a little pricey).

If your workload involves 3D rendering, gaming, or more general-purpose (software, not hardware-accelerated) workloads, you are almost always going to be better off with an equivalently priced PC.

The M3 Max would be the only current Apple chip viable for gaming, and it would be a mid-tier experience at best. And for that it just wouldn't be financially viable. That chip costs too darn much, and has WAY too much stuff in it that would go partly or wholly unused and wasted on a gaming system.

Apple likely could design a chip based on their M-series architecture, with a larger, monster GPU and less of the other Apple-specific stuff, but it wouldn't be a "just use what you have" approach. It would require a complete redesign and special-purpose silicon. And that just isn't Apple's priority, so I don't think it will happen.

Otherwise you'd probably be stuck with an M3 Pro, which would be a bit slower in the graphics department than a PS5, or the much slower M3, as the M3 Max would be way too expensive for that role.

I mean, maybe they could just put together something for super-casual gaming using a phone chip like the A17. That would be more realistic, but still probably not competitive. Why put the chip in a $550 console when they could put it in a $1,200 to $1,600 smartphone?

It just doesn't make financial sense.
 
These are not the same use cases. [...]

There should be an M3 Ultra down the line, and that will get considerably closer to 4090 performance… but you’re probably going to pay a steep premium for that, well beyond the console realm.

I will say that Macs excel at more than just video editing… really, they’re superb for media creation in general.
 
Even an Apple TV with an A18X could do it, and they could try to have Arcade compete against Nintendo.

These are not the same use cases. [...]

From what I've seen with the A-series mobile chips, it would be more than adequate. Remember, it's dedicated to gaming and not a PC that has to do other tasks. Heat would be far easier to control, so clocks would stay locked at their highest levels, and there wouldn't be a display just a mm away heating things up.
As for mobile, truly mobile, workstation duties: unless one is strictly tethered to Windows, the current high-end MacBook Pros simply have no competition, regardless of price. I have the PC equivalent with 16TB of storage (dual 8TB Gen4), 128GB of DDR5-5600, and the highest-end GPU available along with its 350W(!) power adapter, and it's a hot, noisy mess! The Mac is silent the vast majority of the time and has no performance drop when not connected to mains power. I don't plan on upgrading until OLEDs come out, but side by side out of the box, the miniLED is brighter and has more accurate color than the Lenovo and Dell OLEDs. The ever-so-slight deficiency of the latter is no dealbreaker, of course.

As for the Apple TV, I always thought they were underpowered. Not sure if it's tvOS or something else. But if they did come out with, say, an Apple TV Arcade edition with tweaks to appeal to gamers, they just may have something.

I'd much rather see development in that direction than $3.5k snorkel goggles that strain the hell out of my neck and cause bone grinding headaches. JMHO, of course.

Another thing that would make me upgrade before OLED would be the inclusion of a cellular radio in the MacBook Pros. I use a fully spec'd-out ThinkPad X1 Yoga, and that feature is incredibly convenient. With a 5G account in the safety net/first responder/utility industry, there are times when speeds and response are comparable to my home fiber connection! I do like that laptop and it's fast, but like a race car that passes everything on the road except a gas station, it's not without drawbacks when power isn't available.
 
Justifying this change as a way to compete with Nintendo is exactly the wrong play. If anything it could be seen as a shot across the bow at Nvidia and everyone else.
 
Apple already removed the first GBA emulator from the App Store because it used source code from another emulator called GBA4iOS. Weirdly, GBA4iOS was developed 10 years ago by its author while he was still in high school.
https://arstechnica.com/gaming/2024...-emulator-released-under-new-app-store-rules/
The developer violated his license, which requires written permission to submit forks of his code to the App Store. Good on Apple for taking it down quickly at the original author's request.

GBA4iOS is an open source program released under the GNU GPLv2 license, with licensing terms that let anyone "use, modify, and distribute my original code for this project without fear of legal consequences." But those expansive licensing terms only apply "unless you plan to submit your app to Apple’s App Store, in which case written permission from me is explicitly required."
 
From what I've seen with the A-series mobile chips, it would be more than adequate. [...]

The Mac isn't silent. It's just inaudible most of the time. I assure you that with a proper microphone I can prove that your Mac isn't silent.
 
The Mac isn't silent. It's just inaudible most of the time. I assure you that with a proper microphone I can prove that your Mac isn't silent.
Of course no computer is intrinsically silent if you want to go there.
But out of all of the computers I use, it is the least noticeable by far. And very far!
Just having an Intel laptop check for updates makes the fan very noticeable on the balanced power profile, and if it's set to performance it's loud enough to be heard in another room.
The M3 Max is on an entirely different level.
 