They ditched that, are you assuming that most don't know?
They did ditch Imagination. Their GPU is still a derivative of that though.
Personally, the biggest issue I see is Boot Camp/Parallels/VMware Fusion support for Windows apps. That is probably the biggest dealbreaker of all.
It's only Intel's older offerings.
You can laugh about an A12 doing 4K, but Apple has been running right around 1440p for a long time now... a bit more or less depending on the form factor.
I don't think it's quite Crysis level graphics yet, but it's also not shabby or running on a 1000W PSU SLI rig either.
Also, folks may be assuming that an Apple ARM desktop would continue to use the Imagination-derived graphics. Most games are not CPU-bound, particularly at 4K. An integrated GPU wouldn't necessarily be a given; there is no reason they couldn't go to a much more powerful GPU, and Apple has had pretty strong relationships with both AMD and NVIDIA in the past.
Well, I highly doubt the existing A11 can just scale up and scale out to desktop-level power consumption and still manage good performance. It's not designed to do so, and it'll never be able to manage it as is.
Look at NVIDIA and their GPU design philosophies. When they stuck with monolithic big dies, throwing more transistors and power at the problem and trying to scale down, it failed spectacularly - Fermi comes to mind. Then they went back to the drawing board and designed architectures specifically to scale across extreme power envelopes while maintaining performance scaling. It took them three tries to get it to work well (Kepler was a good start, Maxwell was a big improvement, but Pascal took it to another level).
Intel also had a stint with NetBurst where they tried to just throw more power at the problem. They too had to scrap the whole effort.
Now you tell me that Apple has had the secret sauce to the problem all along, while two semiconductor giants were clueless? I'm skeptical at the least, and laughing a little at the notion.
Apple has a long history of not rushing products, releasing them only when they are good and ready. Just because they haven't done it yet doesn't mean they don't have an ace in the hole. Maybe they have spent all this time preparing the framework and ecosystem to support it, rather than launching a half-assed product like most companies would.
Well, I don't doubt that they are very careful about it. However, I do doubt their CPU capabilities, due to the myriad of unknowns still out there; the biggest two are the frequency/power curve and the absolute frequency ceiling of their A12 chips. Another one I have doubts about is the tests themselves, since the results so far are all apples-to-oranges to me. Too many compromises to call it a fair comparison.
I can see a future, maybe as early as next year, where they drop the Air and promote the iPad Pro to be its replacement. All they need is a dock mechanism like the Surface's, and full-blown macOS with their proprietary software on it. That'll work for the Air folks, absolutely. People doing office work that can be done with Apple's office apps will absolutely love the 15+ hours of battery life that it'll offer.
Or go the MS and Razer route, and have a docking "shell" for the iPhone that turns it into an Air, pretty much. That'll work too, and the Apple fans will love it.
Though I still don't see Apple switching to ARM for folks who need performance, like me. Until they can hit enthusiast-level CPU performance at enthusiast-level power consumption, I'll continue to be skeptical.
Have a look at the new iPad Pro and then we'll talk. I call bullshit on your post.
Early word is that its single-thread performance outpaces the hexa-core i7, and multi-core performance is getting a massive 90% bump!
Let me guess: for 10 seconds before it starts to thermally throttle. So it has enough thermal dissipation to handle a benchmark, but good luck encoding a video on it. Add charging to the mix, and it will throttle even more due to the heat from charging.
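The burst-vs-sustained point above can be sketched with a toy thermal model: a chip runs at full clocks until its thermal headroom is exhausted, then drops to whatever its cooling can dissipate continuously. All numbers below are made up for illustration, not measured iPad figures.

```python
# Toy thermal-throttling model (illustrative assumptions only).

def sustained_fraction(burst_watts, sustain_watts, headroom_joules, runtime_s):
    """Average speed (as a fraction of burst speed) over a workload.

    burst_watts:     power draw at full clocks
    sustain_watts:   power the chassis can dissipate continuously
    headroom_joules: thermal energy the package absorbs before throttling
    runtime_s:       how long the workload runs
    """
    # Time until the thermal budget is exhausted at burst power.
    burst_time = headroom_joules / (burst_watts - sustain_watts)
    if runtime_s <= burst_time:
        return 1.0  # a short benchmark never throttles
    # Afterwards, clocks drop so the draw matches what can be dissipated.
    throttled_speed = sustain_watts / burst_watts
    return (burst_time + (runtime_s - burst_time) * throttled_speed) / runtime_s

# A 60-second benchmark vs. a 30-minute video encode on the same
# hypothetical tablet (10 W burst, 5 W sustained, 600 J of headroom):
print(sustained_fraction(10, 5, 600, 60))    # short run: full speed
print(sustained_fraction(10, 5, 600, 1800))  # long encode: throttled average
```

Under these made-up numbers the benchmark sees 100% of burst performance while the long encode averages barely over half of it, which is exactly the gap being argued about.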
Geekbench, "full blown desktop Intel Core i5"...
> Give that A11 SoC a 30-95 watt TDP - I would say that would make a pretty good contender against modern quad/six-core Core i5/i7 CPUs.

Because we all know an architecture's performance scales linearly with power.
> Because we all know an architecture's performance scales linearly with power.

Not sure why this is so "hilarious" to you, as we have all shared some very valuable and telling data from what we know so far - two years from now, we will know for sure either way.
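For what it's worth, the "scales linearly with power" jab has a concrete basis: dynamic power is roughly C·V²·f, and sustaining a higher frequency typically needs a higher voltage (roughly proportional over the useful range), so power grows roughly with the cube of frequency. A back-of-the-envelope sketch, with purely illustrative numbers (not measured A11/A12 figures):

```python
# Rough rule of thumb: dynamic power ~ C * V^2 * f, and V scales roughly
# with f over the useful operating range, so power ~ f^3.
# base_watts is a hypothetical phone-SoC budget, not a real A11 number.

def dynamic_power(base_watts, freq_ratio):
    """Estimated power when clocks are scaled by freq_ratio (V scaled with f)."""
    return base_watts * freq_ratio ** 3

base = 5.0  # hypothetical 5 W phone-SoC budget
for ratio in (1.0, 1.5, 2.0):
    print(f"{ratio:.1f}x clocks -> ~{dynamic_power(base, ratio):.0f} W")
```

The point of the sketch: doubling clocks costs roughly eight times the power under these assumptions, so you cannot simply hand a 5 W design a 40 W budget and expect anywhere near 8x the performance, nor 2x the clocks for 2x the power.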
Some people in this thread have hilarious dreams.
> Geekbench, "full blown desktop Intel Core i5"...

Once again, that was an iPhone 8 A11 ARM64 CPU with a 5 watt TDP compared to an Intel i5 x86-64 notebook CPU with a 28 watt TDP.
> Intel and AMD must be incompetent, as the might of Apple ARM is laughing at them on all fronts. They are just stunned.

They will be when the physical limits of transistors have been reached, SMP scaling hits diminishing returns after so many cores, and x86-64 software efficiency has run its course.
> Moto 68000 -> PowerPC -> x86 -> ARM
Each time it has been extremely painful for the Mac community. I expect this to be no different. I realize this is a step toward their universal operating system platform. While a single platform for every computing device makes sense in terms of maintenance and app crossover, it's a pipe dream.
Something to consider: we're seeing containerization and cross-platform frameworks grow to a point where the OS and hardware are becoming irrelevant. Code in languages like Java, Go, and Python - not to mention the web languages - can be compiled and run anywhere. More important is the consideration of platform: desktop, laptop with touch, tablet, phone, watch... fridge, car, IoT?
As the frameworks evolve, expect that to be tackled too.
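That "run anywhere" point is easy to demonstrate: with an interpreted or bytecode-based runtime, the same source file runs unmodified on x86-64 or ARM64, because the runtime, not the program, deals with the instruction set. A trivial sketch:

```python
# The same source runs unmodified on x86-64 or ARM64; only the runtime
# underneath is architecture-specific. This just reports where it's running.
import platform

def where_am_i():
    return f"{platform.system()} / {platform.machine()}"

# e.g. "Darwin / arm64" on an Apple ARM machine, "Linux / x86_64" on a PC
print(where_am_i())
```

This is exactly why an architecture switch hurts far less for managed or interpreted code than for native binaries, which need recompilation or emulation.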
> Even Microsoft is failing with the x86 to ARM emulators and slow performance, or we all would be running Windows ARM.
> It's obvious Steve Jobs has left the house.
> It's synthetic tests and they aren't compared to desktop CPUs from Intel and AMD. There is no useful information to be extrapolated to real desktop performance and wattages, and it is premature to proclaim (almost) parity, or even superior performance/watt, with current full desktop i5s.
Go back and read post 93 - in fact, it might be a good idea for you to re-read the thread as these comments are starting to repeat the same tired, and obsolete, ideas that no longer apply.
Those are hardly "synthetic" results and they are comparing them to server x86-64 CPUs, let alone desktop CPUs.
> As optimistic as I am about the performance of modern ARM implementations, it's still fair to say that a 1:1 test has not been performed - and it will be very hard to do one without relying on Apple.

You are right, and that really is the bottom line.
Geekbench, SPEC, specific server workloads, Bulldozer, and low-clocked Skylake. Am I missing anything? Please point me to a real-world desktop comparison with current x86 desktop CPUs. Until then, a low-power ARM desktop beast is a pipe dream.
Yeah, no, that was all back in the 1980s to 2000s - things have changed dramatically since then.
Back then, Apple did not have any pre-existing software or OS foundations for the new architectures, unlike now.
Smartphones and tablets were a massive paradigm shift for the world and even more so for Apple.
I'm no Apple fanboy, but it is interesting to see people tout these same tired ideas that haven't held up for more than a decade now on multi-architecture support.
I know this thread is getting a bit long, but you might want to go back and read post 68:
Windows RT died because ARM (ARMv7) CPUs were not nearly as powerful or capable as they are now - at best the fastest units were similar to low-end Intel Atom CPUs, and it was 32-bit only.
ARM, now ARM64 (ARMv8.x) has come a long way since 2012 when Windows RT was introduced.
Again, if this were earlier this decade, I would fully agree with you.
Those ideas and concepts no longer apply at all and certainly won't in 2020.
So because Steve Jobs isn't with Apple any more, they are now "dumb enough to move from x86-64 to ARM64" or something?
You realize it was under Steve Jobs that the company moved from PowerPC to x86, and later x86-64, and then ARM for iOS in the mid-2000s, right?
> Geekbench, SPEC, specific server workloads, Bulldozer, and low-clocked Skylake. Am I missing anything? Please point me to a real-world desktop comparison with current x86 desktop CPUs. Until then, a low-power ARM desktop beast is a pipe dream.

So first you think it is "hilarious" when I said to give an Apple ARM64 CPU 30-95 watts, and now you are acting like we said a low-power ARM desktop processor is going to be a beast... ok then. (Trolling, or just bad reading comprehension?)
> Take it from an expert: hardware emulation from one platform to another is PAINFULLY slow.

I am aware of how difficult it is to emulate, and I doubt Apple will be emulating much of anything with their ARM64 CPUs in 2020.
Maybe you should read this:
https://www.tomshardware.com/news/windows-on-arm-too-expensive,37997.html
Now, if everything ran on UWP (Universal Windows Platform), it would be a much better situation, as pseudo-code (intermediate language) is compiled at load time and optimized for the particular platform. But Apple still runs mostly native code on the desktop; they really don't have an equivalent of UWP. Even if they did have something like a "Universal Apple Platform," you would need to convince developers to flock to it. This is the problem Microsoft is facing and failing at, even though UWP will run on iOS, Android, Windows x86, and Windows ARM from one code source. It seems like a win-win, but developers don't want to learn another standard.
As to the PowerPC-to-x86 move, it was a necessity, as PowerPC was no longer keeping up speed-wise; Apple was getting hammered there. And one of the lures of paying more for an Apple machine is the fact that you can boot a Windows OS and run it. If they go ARM, the Windows side of things will suffer massively and turn off more users, in addition to native apps having to run in emulation mode.
> Geekbench, "full blown desktop Intel Core i5"...

You can't handle it. It's like your mind is blown right now. All those years on the Intel Kool-Aid. Not to worry, that will dissipate eventually. You will be fine in a few years. Probably running Apple hardware. But fine otherwise.
As others have stated here, they will most likely either have an x86-64 core in their SoC for software binaries that require it to avoid emulation, or they will have an x86-64-like ASIC in their SoC to perform the same functions.
> So first you think it is "hilarious" when I said to give an Apple ARM64 CPU 30-95 watts, and now you are acting like we said a low-power ARM desktop processor is going to be a beast... ok then.

No, it was hilarious to equate its current performance to a "full desktop i5" and then to extrapolate its future performance by giving it more TDP and scaling the performance linearly.
A true desktop ARM64 CPU has not been released yet, and until one is, just as IdiotInCharge stated, a true 1:1 test will not be possible.
> They can't do that due to licensing agreements on the x86 instruction set. They would have to add an actual x86 chip from Intel or AMD. Yep, that will do wonders for the price and compatibility, won't it?

I will agree with you on this point, and it will be interesting to see how Apple deals with this.
> And my article was running full Windows 10, not RT. And the performance was horrible even on the latest Snapdragons. Getting software to switch over is a lot harder than you think, or UWP/the Windows Store would be a lot bigger.

Not that I want to turn this thread into an OS war or Windows-bashing session, but Windows 10 is hardly optimized for ARM64 and is extremely bloated overall compared to iOS and OS X, both being UNIX-based operating systems.
Apple also had universal (dual-architecture) binaries in 2006 for both PowerPC and x86/x86-64, along with Rosetta, which seemed to work well enough to fill the gap until x86-64 fully succeeded PowerPC in their code binaries and compilations.
> No, it was hilarious to equate its current performance to a "full desktop i5" and then to extrapolate its future performance by giving it more TDP and scaling the performance linearly.

It's called speculation...
Yes, until it does you have no basis on which to claim it will be as powerful as X86 CPUs with a significantly lower power draw. So live by your own rules.
Where is the plant? Is it in the path of the next typhoon, hurricane, or earthquake?
If I were an investor, I'd want to know.
> It's called speculation...

Oh, you may speculate all you want. Claiming is a different beast.
How dare we use critical thinking with thoughts and ideas based on history and modern technological capabilities to speculate on potential future technologies.