Leaks are trickling in, but the A15 Bionic seems to be beating the Exynos 2200 in GPU performance

Not sure what Apple has been doing, but the much-awaited Exynos 2200, Samsung's upcoming flagship SoC paired with AMD mRDNA graphics, is apparently getting trounced by the A15 Bionic in GPU performance.

The Exynos 2200 scores an impressive 127-170 fps on the GFXBench Manhattan 3.1 test depending on thermal load; the A15 scores between 140 and 198 fps.

For reference, the A14 never managed to pass 120 fps on that test.

It would appear as though Apple might finally be doing something right with their GPUs, which makes me a little excited to see what they have yet to announce.

Somebody else published it in English:
https://wccftech.com/a15-bionic-gpu-throttles-but-beats-a14-bionic-exynos-2200/
But it seems to be a breakdown of this guy's tweets:
https://twitter.com/FrontTron/statu...ses-exynos-2200-even-when-throttled-06689834/
 
I'm not sure why this is surprising to anyone. It's been this way for years now, and Apple is only increasing the gap each year.
It's more that this year's flagship ARM chips are all sporting the new AMD RDNA 2-based mRDNA graphics cores, and Apple, while not being a GPU company, seems to have bested them soundly.

The Exynos 2200, though, is no slouch, and I really look forward to seeing what the various phone manufacturers do with it. The Manhattan test is only a small piece of the picture, and there are plenty of other application benchmarks that are more meaningful, but it's still a solid indicator of things to come.
 
It's more that this year's flagship ARM chips are all sporting the new AMD RDNA 2-based mRDNA graphics cores, and Apple, while not being a GPU company, seems to have bested them soundly.

The Exynos 2200, though, is no slouch, and I really look forward to seeing what the various phone manufacturers do with it. The Manhattan test is only a small piece of the picture, and there are plenty of other application benchmarks that are more meaningful, but it's still a solid indicator of things to come.
Just because it's an AMD flux-capacitor whatever doesn't change the fact that it's them competing in a particular market that Apple has soundly been ahead of for years now, even more so when it comes to the GPU. This is why I'm excited to see how well this performance scales at desktop TDP levels.
 
Apple's mobile GPUs are pretty incredible in their power envelope. It's scaling that might make them truly impressive on the desktop.

Anyone who overclocks knows full well what it's like to double the power consumption of a CPU to get 10% more performance. Scaling is everything, and it can suck.
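To put toy numbers on that scaling point, here's a minimal sketch (all figures invented for illustration, not measurements of any real chip):

```python
# Toy illustration of why perf/watt collapses when chasing the last few
# percent of performance. Every number here is made up for the example.

base_power = 100.0  # watts at stock settings
base_perf = 100.0   # arbitrary performance units at stock settings

oc_power = 2.0 * base_power  # overclock: double the power draw...
oc_perf = 1.10 * base_perf   # ...for only 10% more performance

stock_eff = base_perf / base_power  # 1.00 perf units per watt
oc_eff = oc_perf / oc_power         # 0.55 perf units per watt

print(f"stock: {stock_eff:.2f} perf/W, overclocked: {oc_eff:.2f} perf/W")
# Efficiency drops to 55% of stock, which is exactly the worry about
# scaling a mobile-tuned design up to desktop power budgets.
```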
 
Apple's mobile GPUs are pretty incredible in their power envelope. It's scaling that might make them truly impressive on the desktop.

Anyone who overclocks knows full well what it's like to double the power consumption of a CPU to get 10% more performance. Scaling is everything, and it can suck.
Anyone who overclocks knows full well an Apple GPU has no chance on the desktop.
 
I couldn't care less about Apple's GPU; their software API support and developer treatment are abysmal.

Like dropping OpenGL (well, deprecating it, but still), never supporting Vulkan, and only just a week ago finally enabling WebGL 2 support in Safari. It's a joke.
 
I couldn't care less about Apple's GPU; their software API support and developer treatment are abysmal.

Like dropping OpenGL (well, deprecating it, but still), never supporting Vulkan, and only just a week ago finally enabling WebGL 2 support in Safari. It's a joke.
I hated working with OpenGL; it was such a pain for most things. I'm not overly upset about the lack of Vulkan support: MoltenVK does a good job of translating to Metal, and Metal isn't terrible to work with if you're doing something Apple-native. But it's not like most places work with Vulkan natively either; most use wrapper APIs on top of Vulkan, simply because Vulkan is just so... meticulous. I'm honestly wondering if Microsoft is ever going to supply DX to other platforms. Not open-sourcing it, that I can't see happening at all, but I could see a closed-source Linux kit, if for no other reason than to stifle the open-source projects attempting to reverse engineer it.
 
Lots of people here and in other places kept saying the GPU in the M1 wasn't that impressive and couldn't compete, and wouldn't accept that Apple had built a capable GPU, for whatever reason, despite all the available data.

I can’t imagine anyone who has been actually paying attention will be surprised by this.

Of course Apple built a capable GPU. How else would they appeal to power users and creator types who were excited about Vega in the MacBook Pro? By making the new ones slower?
 
So we can run 10-year-old PC games ported to the Apple iPhone/iPad but not actually play them on a phone, because the control interface is still useless without a paired controller? iPad, OK, maybe, but if you're getting an iPad to game with 10-year-old games, can't an ultrabook do that too? I guess I'm curious who the market is for advanced GPUs on phones... iPads, I start to see the allure.
 
So we can run 10-year-old PC games ported to the Apple iPhone/iPad but not actually play them on a phone, because the control interface is still useless without a paired controller? iPad, OK, maybe, but if you're getting an iPad to game with 10-year-old games, can't an ultrabook do that too? I guess I'm curious who the market is for advanced GPUs on phones... iPads, I start to see the allure.
Mobile gaming is increasingly important, particularly in places like China and India where you're more likely to game on your phone (as you probably can't afford a high-end gaming PC or separate console). That and certain non-gaming tasks will benefit from a faster GPU, such as photo and video capture/editing. You can see more effects in real time and sometimes process them faster.
 
Mobile gaming is increasingly important, particularly in places like China and India where you're more likely to game on your phone (as you probably can't afford a high-end gaming PC or separate console). That and certain non-gaming tasks will benefit from a faster GPU, such as photo and video capture/editing. You can see more effects in real time and sometimes process them faster.
And battery life. When you're doing video calls, GUI rendering, etc., the GPU really helps.
 
Race-to-sleep.


With a GPU you're interacting with in real-time?

No, this is just ammo for the 17-year-old spoiled gamer to tell his parents he needs this New Shiny (TM), even if it's only like 10% faster.

The switch to RDNA 2 is just acknowledging how much worse Mali is at this (and why everyone is excited about NVIDIA acquiring ARM and cutting the dead weight!).
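For anyone unfamiliar with the race-to-sleep argument raised above, here's a minimal sketch with invented numbers (not measurements of any real device):

```python
# Race-to-sleep in one calculation: energy = power * time. A faster chip
# burns more watts but finishes sooner and spends longer asleep.

idle_power = 0.1  # watts while the GPU sleeps (invented figure)
window = 4.0      # seconds of wall-clock time we account for

# Slow GPU: 2 W, busy for the whole 4 s window.
slow_energy = 2.0 * window
# Fast GPU: 5 W, done in 1 s, then idles for the remaining 3 s.
fast_energy = 5.0 * 1.0 + idle_power * (window - 1.0)

print(f"slow: {slow_energy:.1f} J, fast: {fast_energy:.1f} J")  # 8.0 vs 5.3
# The objection above still stands, though: this only pays off for
# bursty, fixed-size work (decode a photo, composite a frame), not for
# a game that keeps the GPU loaded in real time.
```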
 
Lots of people here and in other places kept saying the GPU in the M1 wasn't that impressive and couldn't compete, and wouldn't accept that Apple had built a capable GPU, for whatever reason, despite all the available data.

I can’t imagine anyone who has been actually paying attention will be surprised by this.

Of course Apple built a capable GPU. How else would they appeal to power users and creator types who were excited about Vega in the MacBook Pro? By making the new ones slower?
Isn't Apple still going with AMD's graphics for future products? Wasn't it leaked relatively recently that Apple is still using AMD RDNA 2 graphics? Also, isn't the Exynos 2200 not released yet? And isn't the A15 throttling from, I'm guessing, heat? How's power efficiency?
 
Mobile gaming is increasingly important, particularly in places like China and India where you're more likely to game on your phone (as you probably can't afford a high-end gaming PC or separate console). That and certain non-gaming tasks will benefit from a faster GPU, such as photo and video capture/editing. You can see more effects in real time and sometimes process them faster.
Noted, I am of course coming from a very Western-centric POV, what with our consoles and our PCs and our disposable 75" TVs... good point.
 
Isn't Apple still going with AMD's graphics for future products? Wasn't it leaked relatively recently that Apple is still using AMD RDNA 2 graphics? Also, isn't the Exynos 2200 not released yet? And isn't the A15 throttling from, I'm guessing, heat? How's power efficiency?
Apple still uses the W5000X- and W6000X-series cards for their current workstation lineup, and I recall them announcing an update to the Afterburner cards. But who knows what Apple has in store for their next launch cycle.

And yes, the Exynos 2200 was Samsung's shot at the performance crown, designed to square off against the A15; Apple just came out swinging, is all.

From other sites I've seen, they both throttle. Unthrottled, the Exynos hits 170 fps vs the A15's 190.
Fully throttled it's 127 fps vs 140.
 
Noted, I am of course coming from a very Western-centric POV, what with our consoles and our PCs and our disposable 75" TVs... good point.
Yeah, mobile gaming pulled in over $80B last year while PC and consoles combined brought in $82B. In North America we see Candy Crush and the like, but in Asia and Europe you get Genshin Impact and Black Desert Mobile. Not exactly light on the GPU side of things.
 
Noted, I am of course coming from a very Western-centric POV, what with our consoles and our PCs and our disposable 75" TVs... good point.
The mobile gaming market in China, Hong Kong and Taiwan is particularly interesting to watch. You see brands like ASUS, Lenovo and ZTE releasing these ridiculously over-the-top phones with 18GB of RAM, dual USB charging ports, 160Hz refresh rates... anything to help that FPS or MOBA run a little faster, or to keep playing for a little longer. The rationale, I imagine, is that spending $200 more on your phone beats spending $1,000 on a computer you won't use very often.
 
The mobile gaming market in China, Hong Kong and Taiwan is particularly interesting to watch. You see brands like ASUS, Lenovo and ZTE releasing these ridiculously over-the-top phones with 18GB of RAM, dual USB charging ports, 160Hz refresh rates... anything to help that FPS or MOBA run a little faster, or to keep playing for a little longer. The rationale, I imagine, is that spending $200 more on your phone beats spending $1,000 on a computer you won't use very often.
That covers half of a desktop GPU these days. Going to need more scratch to get the rest of the stuff.
 
That covers half of a desktop GPU these days. Going to need more scratch to get the rest of the stuff.
I'm thinking of both pricing differences in the region and what someone is likely to consider reasonable given local income levels. Might need to be higher, but I also don't see a typical Shanghai resident dropping the equivalent of $2K in addition to buying a phone.
 
I hated working with OpenGL; it was such a pain for most things. I'm not overly upset about the lack of Vulkan support: MoltenVK does a good job of translating to Metal, and Metal isn't terrible to work with if you're doing something Apple-native. But it's not like most places work with Vulkan natively either; most use wrapper APIs on top of Vulkan, simply because Vulkan is just so... meticulous. I'm honestly wondering if Microsoft is ever going to supply DX to other platforms. Not open-sourcing it, that I can't see happening at all, but I could see a closed-source Linux kit, if for no other reason than to stifle the open-source projects attempting to reverse engineer it.
Using a wrapper API in your code does not mean the compiled result is not using Vulkan natively, right? (I could be misunderstanding what you're saying, but someone using a VulkanSceneGraph API is still making a program that uses Vulkan natively.)
 
Using a wrapper API in your code does not mean the compiled result is not using Vulkan natively, right? (I could be misunderstanding what you're saying, but someone using a VulkanSceneGraph API is still making a program that uses Vulkan natively.)
It's using Vulkan natively, but Vulkan is a very complicated API, and as you would expect from a low-level API there is no hand-holding: you must be exact with every little command. As such, most developers don't actually code in Vulkan directly, as it's far too time consuming and too difficult to troubleshoot; they code to an API that operates at a higher level and let it translate down. GLUT was an example of such a library for OpenGL back in the day (that's the one the place I was working for at the time used).

But there are many development toolsets now where you aren't coding for DX12U, Vulkan, or Metal at all. You use generic calls that are proprietary to that toolset, and it can freely translate between any and all of them with the check of a few boxes, as in the sketch below. So for large studios the platform they're targeting is almost arbitrary: they build something, press a few buttons, and it gets compiled to the desired profile.
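As a rough sketch of that translation-layer idea, here's a toy dispatcher in Python. The class and method names are invented for illustration; this is not the API of any real engine, toolset, or of MoltenVK:

```python
# Toy sketch of how a toolset abstracts over Metal/Vulkan/DX12: game code
# talks to one generic interface, and the backend chosen at build time
# translates those calls into platform-API calls.

from abc import ABC, abstractmethod

class GraphicsBackend(ABC):
    """Generic calls the engine codes against (names invented)."""
    @abstractmethod
    def create_buffer(self, size: int) -> str: ...
    @abstractmethod
    def draw(self, vertex_count: int) -> str: ...

class MetalBackend(GraphicsBackend):
    def create_buffer(self, size: int) -> str:
        return f"MTLDevice.makeBuffer(length: {size})"
    def draw(self, vertex_count: int) -> str:
        return f"drawPrimitives(vertexCount: {vertex_count})"

class VulkanBackend(GraphicsBackend):
    def create_buffer(self, size: int) -> str:
        return f"vkCreateBuffer(size={size})"
    def draw(self, vertex_count: int) -> str:
        return f"vkCmdDraw(vertexCount={vertex_count})"

def select_backend(target: str) -> GraphicsBackend:
    # The "check of a few boxes" step: pick the target profile.
    backends = {"metal": MetalBackend, "vulkan": VulkanBackend}
    return backends[target]()

gpu = select_backend("metal")  # flip one switch to retarget
gpu.create_buffer(4096)
print(gpu.draw(3))             # same game code, Metal underneath
```

The returned strings merely stand in for real API calls; the point is only that the game-side code never mentions the platform API directly.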
 
The mobile gaming market in China, Hong Kong and Taiwan is particularly interesting to watch. You see brands like ASUS, Lenovo and ZTE releasing these ridiculously over-the-top phones with 18GB of RAM, dual USB charging ports, 160Hz refresh rates... anything to help that FPS or MOBA run a little faster, or to keep playing for a little longer. The rationale, I imagine, is that spending $200 more on your phone beats spending $1,000 on a computer you won't use very often.
The mobile market is big because a lot of people around the world can't afford to buy console games. It's not that it's better or anything, but it's free, and a phone costing the equivalent of ~$200 lets someone play games while still buying the things they absolutely need. Though I wonder how the piracy situation compares between mobile and console/PC gaming? If you're broke, then piracy is on the table.
 
Isn't Apple still going with AMD's graphics for future products? Wasn't it leaked relatively recently that Apple is still using AMD RDNA 2 graphics? Also, isn't the Exynos 2200 not released yet? And isn't the A15 throttling from, I'm guessing, heat? How's power efficiency?

Literally nothing you said here had any relevance to what I posted. Apple can make a good GPU and for some reason that's... not a good thing?

Did you have a point to make?
 
Literally nothing you said here had any relevance to what I posted. Apple can make a good GPU and for some reason that's... not a good thing?

Did you have a point to make?
Do you?

Apple built a GPU based on Imagination's PowerVR technology, so of course it'll be good. Though one can't praise Apple without knowing what kind of RDNA 2 is used in the Exynos 2200. It's not even out yet and we're declaring victory; it'll be out next year. The Exynos 2200 has 384 stream processors, which may or may not be their high-end model. We don't even know which will consume more power. Also, just to point out, the Exynos SoCs don't typically come in American-model Samsung phones; for some reason we end up with the Qualcomm versions anyway. The fact that both SoCs throttle is kinda disturbing, since their extra performance will seem pointless.
 

Yes! Apple can make a good GPU and that’s okay.

I don’t understand why so many people have a problem with that, though. More competition is good for us.

Doesn't effectively matter if we can't buy anything, of course, but the alternative is just everybody using whatever Qualcomm gives them, and Samsung and Google and whoever else not bothering to roll their own to compete. And we all see where that left us: flagship Qualcomm SoCs in 2021 that are finally catching up to the A12 from 2018.
 
The mobile market is big because a lot of people around the world can't afford to buy console games. It's not that it's better or anything, but it's free, and a phone costing the equivalent of ~$200 lets someone play games while still buying the things they absolutely need. Though I wonder how the piracy situation compares between mobile and console/PC gaming? If you're broke, then piracy is on the table.
Oh, that's absolutely true as well. And it's great... it democratizes gaming in a way that PCs and even consoles don't allow.

Piracy is rampant on Android; I'm not sure how consoles are faring, but it's considerably less common on iOS. I still remember how some games were (and likely still are) ad-supported on Android but paid and ad-free on iOS, as developers knew Android users were much more likely to steal apps.
 
Oh, that's absolutely true as well. And it's great... it democratizes gaming in a way that PCs and even consoles don't allow.

Piracy is rampant on Android; I'm not sure how consoles are faring, but it's considerably less common on iOS. I still remember how some games were (and likely still are) ad-supported on Android but paid and ad-free on iOS, as developers knew Android users were much more likely to steal apps.
Piracy exists on iOS, but it's not like it used to be, since it's so locked down now. It's not like the old days where you could just visit a website and jailbreak your phone.
 
Honestly, I fail to see why this matters.

My email and web browsing on my phone will go the same speed either way :p

Not sure if it's part of the GPU benchmarks, but the tensor cores are going to change computing in a way that will make the big-data processing currently done in the cloud look like Babbage's difference engine.

Apple's recently announced applications in detecting illegal images and diagnosing depression in its users are just the tip of an iceberg that has been weighed down by extra rocks ... lots of them.
 
Not sure if it's part of the GPU benchmarks, but the tensor cores are going to change computing in a way that will make the big-data processing currently done in the cloud look like Babbage's difference engine.

Apple's recently announced applications in detecting illegal images and diagnosing depression in its users are just the tip of an iceberg that has been weighed down by extra rocks ... lots of them.

Great, just great.

I wish we could ban all this shit.

Ban any and all collection, use and analysis of personal data for any purpose.

Big data must die.
 
Great, just great.

I wish we could ban all this shit.

Ban any and all collection, use and analysis of personal data for any purpose.

Big data must die.
Talk to our legislative branch. They're the ones passing laws requiring companies like Apple to do this. Apple/Google have no option here; if they don't comply, the federal government will force them into compliance very quickly.
 
Yes! Apple can make a good GPU and that’s okay.

I don’t understand why so many people have a problem with that, though. More competition is good for us.
Apple's GPUs are mostly Imagination's PowerVR GPUs. When Apple dropped Imagination and acquired some of their employees, they also pushed Imagination into being bought by a Chinese company. PowerVR isn't dead, but they don't have much going for them nowadays, and we will likely not see the same progress from Imagination as we did in the past.

We didn't gain a competitor when we lost Imagination's PowerVR to some Chinese company while Apple poached their engineers and licensed their tech. Bad enough that when Nvidia buys ARM, we may lose Mali graphics as well.
Doesn't effectively matter if we can't buy anything, of course, but the alternative is just everybody using whatever Qualcomm gives them, and Samsung and Google and whoever else not bothering to roll their own to compete. And we all see where that left us: flagship Qualcomm SoCs in 2021 that are finally catching up to the A12 from 2018.
The fact that Apple and Qualcomm have to build their own ARM designs shows that ARM has stagnated. It doesn't help that ARM is broke and their only saving grace is Nvidia buying them. ARM and Imagination going broke isn't more competition. Everyone catching up to Apple has a lot to do with companies like Apple not paying enough royalties to companies like ARM and Imagination, who were the staples of mobile tech. It's gotten so bad that AMD has thrown their hat into the race, but we went from Mali + Nvidia + PowerVR + Adreno to what will soon be Nvidia + AMD + Apple, with Apple being exclusive to themselves.
 
Not sure if it's part of the GPU benchmarks, but the tensor cores are going to change computing in a way that will make the big-data processing currently done in the cloud look like Babbage's difference engine.

Apple's recently announced applications in detecting illegal images and diagnosing depression in its users are just the tip of an iceberg that has been weighed down by extra rocks ... lots of them.
Sending known child porn will give users a notice that the content is inappropriate, and saying you want to kill yourself with Siri around will have her suggest the suicide hotline.

Are those the things you are referring to?
 
Sending known child porn will give users a notice that the content is inappropriate, and saying you want to kill yourself with Siri around will have her suggest the suicide hotline.

Are those the things you are referring to?
No. I am referring to the use of on-device AI to attempt to detect modified child porn images and to diagnose depression from users' behaviour patterns.

Over at Google: I suspect they are going to use the technology to do on-device user analysis for their cookie-replacement tech.
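For context on how software can recognize a modified copy of a known image at all, here's a toy average hash in Python. This is emphatically not Apple's NeuralHash (which is a learned neural hash), just the textbook perceptual-hash idea: unlike a cryptographic hash, small edits only flip a few bits.

```python
# Toy perceptual hash (aHash). Near-duplicate images (resized,
# recompressed, lightly edited) produce hashes at a small Hamming
# distance, which is what makes "modified image" matching possible.

from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size, size))  # 8x8 grayscale
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:                    # build a 64-bit fingerprint:
        bits = (bits << 1) | (p > avg)  # 1 if pixel brighter than average
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")  # differing bits = dissimilarity

# Hypothetical usage (file names invented): a small distance suggests the
# same picture after cropping or recompression. Real systems use learned
# hashes, blinded server-side matching, and carefully tuned thresholds.
# print(hamming(average_hash("a.jpg"), average_hash("b.jpg")))
```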
 