I know Apple has gone through these transitions before (68K -> PowerPC -> Intel). But if there is a move to ARM, this is the first time they have moved away from a currently dominant ecosystem (yes, to another popular one in this case; but the earlier moves were all abandon-ship-style transitions). I don't see a lot of motivation for pre-compiled open source projects (such as R/CRAN) to maintain two copies of their software (one Apple AMD64 and one Apple ARM64) during the transition. Yet another thing that is going to make Macs less convenient for research and development.
I would expect Apple to bring back the Rosetta component for such cases (i.e. the same interface to the OS, but now with a component doing Intel->ARM dynamic binary translation rather than PowerPC->Intel dynamic binary translation).
Actually, maybe dynamic binary translation wouldn't be needed all that often. A new Rosetta would likely also be helped out quite a bit by the fact that modern macOS binaries can contain embedded copies of their LLVM intermediate representation; you could transpile that IR (by patching Intel-targeted intrinsics with ARM polyfills), run it through the LLVM optimizer again, and then cache the result. That's a fully static binary translation, and the resulting binaries would probably be pretty fast.
(You'd still need dynamic binary translation for anything with its own JIT, but maybe there are few enough programs with their own JITs—and all such programs are "big" enough in terms of development resources—that Apple can require these to ship an ARM target. In which case, maybe they don't need to build a new dynamic binary translation component at all!)
The translation stuff may be okay for user code (and I know Apple has done it before). But for scientific code it is just too much cost and too many things to break. I know scientists are not Apple's target market, but this just makes getting a Linux laptop a much better proposition.
> I don't see a lot of motivation for pre-compiled open source projects (such as R/CRAN) to maintain two copies of their software (one Apple AMD64 and one Apple ARM64) during the transition. Yet another thing that is going to make Macs less convenient for research and development.
You might want to look at Apple's history and how they managed to pull off two of the largest CPU transitions in history almost effortlessly. It's called fat binaries, or in Apple's world, universal binaries (https://en.wikipedia.org/wiki/Universal_binary).
You don't need to ship two separate copies; you just add a second target and Xcode will give you a single binary that works on two separate platforms.
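As a sketch of the mechanism using today's standard macOS toolchain commands (the i386/x86_64 pair here stands in for whatever slice an ARM transition would add; this only runs on a Mac with the command-line tools installed):

```shell
# Compile the same source into one Mach-O "fat" binary containing two
# architecture slices; an ARM transition would just add another -arch.
clang -arch i386 -arch x86_64 -o hello hello.c

# lipo inspects, joins, and splits the slices in a fat binary.
lipo -info hello
lipo hello -thin x86_64 -output hello_x86_64
```

The loader picks the matching slice at launch, which is why a single download could serve both sides of the PowerPC -> Intel transition.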
In addition, Microsoft is also going through the same phase with their own ARM branch of Windows 10.
On one hand it's cool to have more effort going into desktop CPUs (i.e. high single-thread performance) that are not x86.
On the other hand, although the ISA is public, I'm guessing the whole boot-process & platform will become more and more closed as apple moves towards custom chips all the way down.
Is there any chance you'll still be able to run Linux on Mac hardware in 5 years?
> I'm guessing the whole boot-process & platform will become more and more closed as apple moves towards custom chips all the way down.
You make a very good point. The dominance of x86 is largely because it forms the long-standing open PC platform, which, while Apple and others have been trying to strip of "legacy" features and lock down for a long time, still remains quite well-documented and open in comparison to smartphone/tablet SoCs, which have next to no detailed public documentation at all.
Yep. I see this too. Mac app store apps have been dual architecture for a long time. 32 bit and 64 bit. Adding ARM as another compilation target will take some work but is doable.
Maybe they go to three architectures in the app store, or maybe they only support ARM after 32 bit is dead. Which works out nicely with the recent deprecation of 32 bit apps.
Since the iOS simulator is an x86 target it might not need any work. Adding a compiler target for ARM OS X may just be a matter of allowing the configuration.
Also, given the Intel OS X project was cooking long before they announced the transition, it’s likely safe to believe that full blown OS X on ARM is already running somewhere in a lab at Apple.
32-bit is effectively dead... the only need would be to support x64 and ARM. Which could be okay, so long as their cross-compilation works from the Pro. My biggest fear is they'll abandon the Intel architecture altogether, which will be pretty bad for those of us relying on hackintosh desktops today.
No idea where all the creative types will go if that happens. Apple could indeed be shooting themselves in the foot on this one.
What is day one of ARM-based MacBook ownership going to be like for a software developer?
Do we open Terminal and go to `brew install` a bunch of development tools... and then it hits us... this machine is cool and all, but half the stuff we need to get work done does not build or run on arm64?
Homebrew lead maintainer here: if Apple releases officially supported ARM hardware with Xcode, we will be trying to make Homebrew work well on it as soon as possible.
Surely Homebrew will just extend to support building for ARM. For some packages that will “just work”; others will require some upgrades, but that’s already the case for OS upgrades now. Homebrew operates build servers to keep cached builds; I’m sure they can do the same for both ARM and x86.
It would be a big job, but much of the software on Homebrew is already ARM compatible, and I think there’s a clear path to supporting that, which should make the migration relatively seamless for most developers I’d imagine.
It won’t be without issue, but I don’t think it will hurt macOS as a development OS much in the long run.
It may not be a device for software developers on day one. I guess Xcode will obviously work, as will the default tools such as Python or Ruby. But everything else will take porting time, and that's fine, because the target market will be casual users looking for a sub-$799 MacBook.
Apple has done this before for PowerPC software on Intel hardware, and Microsoft is doing it right now for Intel software on ARM hardware: compatibility modes (or whatever you'd call them; they enable the software to run without modification, albeit at lowered performance).
When Apple did the PowerPC -> Intel switch, they were going to a higher performing CPU, so the efficiency loss in having to emulate was somewhat offset by the higher performance processor. In this case, they will most likely be taking a step down in performance, compounding the loss of efficiency in having to emulate.
Yet Apple has enough ecosystem power to force most developers to switch quickly. And if they don't, the worst customer complaint will be the app running slowly. But I'd expect them to release a toolchain months before they start selling an ARM Mac.
Plus, the Mac App Store makes this a lot easier this time round. Apple will have a big "optimized for Air" sticker on the App. 99% of common apps will switch in no time.
I think the dev tooling will migrate fine... but I think the creative tools for video editing/encoding etc. will suffer to a degree that cannot really be overstated.
Maybe Adobe will finally take Linux seriously if that happens.
Brew being a set of ports as opposed to binary compatible is going to hurt Apple developers here. To compare, at Build 2018 I played with Windows Subsystem for Linux on an ARM64 Windows laptop. WSL being binary compatible meant all of the Ubuntu (or Debian, or …) packages for ARM64 were available.
Ports as in they are not using upstream binaries. I think I'm using the term ports correctly - like FreeBSD ports? That is, they may use the upstream source code (or not - I doubt that's true of all Homebrew packages) but they are not the native packages from an upstream *nix distribution.
I may have also spoken imprecisely. Ultimately what I mean is this: I can copy a binary or a package compiled for Ubuntu, Debian, RHEL, what-have-you, targeting the Linux kernel, and I can run or install that in Windows Subsystem for Linux as is.
Here is the sources.list from Ubuntu 18.04 on Windows, with comments removed:
deb http://archive.ubuntu.com/ubuntu/ bionic main restricted
deb http://archive.ubuntu.com/ubuntu/ bionic-updates main restricted
deb http://archive.ubuntu.com/ubuntu/ bionic universe
deb http://archive.ubuntu.com/ubuntu/ bionic-updates universe
deb http://archive.ubuntu.com/ubuntu/ bionic multiverse
deb http://archive.ubuntu.com/ubuntu/ bionic-updates multiverse
deb http://archive.ubuntu.com/ubuntu/ bionic-backports main restricted universe multiverse
deb http://security.ubuntu.com/ubuntu/ bionic-security main restricted
deb http://security.ubuntu.com/ubuntu/ bionic-security universe
deb http://security.ubuntu.com/ubuntu/ bionic-security multiverse
I get the feeling since the whole 'i' thing is iOS that iBook might be an iOS portable and we'll get some other name for the Mac. An iOS portable would compete fairly well in schools where Chromebooks are winning and iPads are not.
I can see a low-cost ARM-based MacBook (savings from not paying the Intel tax) marketed as long-battery-life, high-end hardware: a replacement for Chromebooks. It can still run a modified OS X which allows for Mac App Store apps (MS Office just announced Mac App Store support), browser, email, etc. Throw in native iTunes/iPhoto with cloud storage and you've got a pretty nice offering.
The writing has been on the wall for ages with this one.
Intel's CPU limitations are at the heart of most of the complaints about Apple's computer line (power vs performance tradeoffs).
Apple will always choose thin (poor heat dissipation) over fast.
Is anyone doing OSX development? Is Apple collecting the LLVM intermediate code for OSX apps like they do for iOS? It's been a few years since I've done OSX/iOS dev, so I could be off base. But it has often occurred to me that they could use the intermediate code to retarget apps to different architectures.
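For what it's worth, embedded bitcode is opt-in (`-fembed-bitcode`), and as far as I know Apple only collects it for iOS/watchOS/tvOS App Store submissions, not Mac apps. You can check a given binary yourself: when bitcode is present it shows up as a `__LLVM` segment in the Mach-O file (macOS-only tooling, shown as a sketch):

```shell
# Build an object file with embedded bitcode.
clang -fembed-bitcode -c main.c -o main.o

# Look for the __LLVM segment among the load commands; if it's
# absent, there is no embedded IR available to retarget.
otool -l main.o | grep -A2 __LLVM
```

So the retargeting idea would only work for apps whose developers actually opted in at build time.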
I dev on macOS. Mostly node and related though. Almost everything I use will be unaffected... Video/Photo creative types are likely to see an exodus if it happens though.
I think this would be a great time for Adobe to partner with Canonical as a supported platform for Creative Suite, with improved funding and work towards making NVidia and AMD support top of the line.
Not sure I see the relevance. Mac OS US installed base is about 46 million, Linux is about 5.8 million. (Based on browsing data for share of total installed base of about 350 million in the US.)
The actual numbers can be sliced and diced different ways of course, but by any accounting the Mac is small compared to Windows and Linux is much smaller than that. I just don't see the economic motivation for companies like Adobe, particularly when you add in the diversity of Linux versions that would have to be supported. Whether or not Adobe can generate working code on a platform is only a small part of the cost of releasing on that platform. Don't hold your breath...
I don't know for certain, but I'd venture to guess half of Adobe's users are on Macs, and that an ARM based architecture won't work well for creative works (heavy image manipulation, video editing and encoding). And that could lead to a lot of Mac creative types looking for somewhere to go.
It could be an opportunity for Adobe, if they played their cards right.
They'll do a Rosetta + fat binaries and use ARM's hardware virtualization features to run a lightweight hypervisor, running these apps on top of the ARM CPU faster than any pure emulation technology. There's a reason Apple added Hypervisor.framework to OS X Yosemite; it wasn't just for supporting VMs via the Mac App Store, but, I suspect, to virtualize legacy apps on top of fast ARM chips.