What's the argument for why fat binaries like this are a good idea, as opposed to having separate binaries that can be downloaded for different architectures? It seems like the main advantage would be when there is (something like) a networked drive simultaneously accessed by incompatible machines, with a secondary case for when the same drive is transferred from machine to machine (or when a single machine is upgraded to a different processor). Is this a large enough use case to drive adoption, or is there some other major benefit that I'm missing? Or is it that having 2x (or 17x) larger binaries to distribute isn't actually much of a downside, so even a small benefit is sufficient justification?
For most apps, 2x binary size is no big deal. This has been true since the 1990s, when we had fat binaries that were M68K and PowerPC. Nobody’s doing 17x; that’s just a joke.
Figuring out how to get users to download the correct binary, now that sucks. Do you add JavaScript to your download page or do server-side sniffing with the user agent to deliver the correct binary? And do you want to explain to someone that yes, they downloaded the Mac version but they have an x86 Mac and can’t download the ARM binary?
I would go further than "no big deal"—it's a complete nonissue. You could have 5 architectures and it would still be a nonissue, because for most graphical applications, 99% of the size is taken up by graphics and similar assets.
The copy of TextEdit on my Mac is 6.8 MB, but the actual binary is only 167 KB. Even if you were to go nuts and compile a full-on 17-architecture version of the TextEdit binary, it would still constitute less than half the size of the app, and this is for TextEdit, an app with exceedingly few graphics.
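The arithmetic on those numbers checks out — a quick sanity check using the sizes quoted above:

```python
# Sanity check on the TextEdit sizes quoted above: even a hypothetical
# 17-architecture binary stays under half the 6.8 MB bundle size.
binary_kb = 167            # single-architecture TextEdit binary, in KB
bundle_kb = 6.8 * 1024     # total app bundle size, in KB
fat_17_kb = binary_kb * 17 # imaginary 17-architecture fat binary

print(fat_17_kb)                   # 2839 KB, i.e. roughly 2.8 MB
print(fat_17_kb < bundle_kb / 2)   # True
```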
And, we're talking exclusively about hard disk space—it's not as though the foreign architectures are getting loaded into memory and taking up space there. I'm all about sweating the details when it comes to optimization and performance, but even on a smaller SSD, a couple extra megabytes per app just isn't going to have a real impact.
Fun fact: many of the binaries in Snow Leopard are still PPC-Intel universal binaries, even through 10.6.8. The extra space used was so minimal that Apple didn't bother to fully strip out the PPC portions until Lion. (Early versions of Snow Leopard can even be booted on PPC by replacing certain files: https://forums.macrumors.com/threads/snow-leopard-on-unsuppo...)
>And do you want to explain to someone that yes, they downloaded the Mac version but they have an x86 Mac and can’t download the ARM binary?
Can't you detect the system architecture through the user agent string? e.g. https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Us.... I guess it's still an issue if you're downloading for an ARM Mac while using an x86 Mac, but that's a pretty rare occurrence, and you can work around it by allowing users to select the architecture manually.
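As a sketch of what that server-side sniffing might look like (the URLs and the function are made up for illustration, and note the limitation baked into the comments — the Mac UA string doesn't reliably distinguish ARM from Intel, which is exactly the rare case mentioned above):

```python
# Hypothetical server-side UA sniffing to pick a download link.
# Real-world UA strings are unreliable; in particular, Safari on
# Apple Silicon still advertises "Intel Mac OS X", so the Mac case
# can't distinguish architectures and needs a manual override.
DOWNLOADS = {
    "mac": "/downloads/myapp-universal.dmg",
    "win64": "/downloads/myapp-x64.exe",
    "win32": "/downloads/myapp-x86.exe",
    "linux": "/downloads/myapp.tar.gz",
}

def pick_download(user_agent: str) -> str:
    ua = user_agent.lower()
    if "mac os x" in ua:
        return DOWNLOADS["mac"]  # can't tell ARM from Intel here
    if "windows" in ua:
        if "win64" in ua or "x64" in ua or "wow64" in ua:
            return DOWNLOADS["win64"]
        return DOWNLOADS["win32"]
    return DOWNLOADS["linux"]

print(pick_download(
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15"
))
# → /downloads/myapp-universal.dmg
```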
The only advantage that user agent sniffing has over fat binaries is the decreased size, and usually, it’s a fairly small difference. Small benefit, big drawback. Plus, there are all the smaller developers or hobbyists who just want to share the app they made, and maybe don’t want to figure out how to get user agent sniffing working on their site—or maybe they are just sharing a link to Google Drive with the game that they made.
Also from my experience, people still share downloaded copies of applications.
We can use thin binaries only when necessary—web browsers, for example.
User confusion is a real thing. The first stage of this is the download page itself: how does an average user know which of the 3 or 4 binaries you offer to pick?
Then a year later the user gets a new computer and migrates everything to it using Migration Assistant. The platform specific binary he downloaded from your website no longer runs, and he assumes it’s because it’s broken, or the new computer has a new OS, or he has to buy a new copy for the new computer.
Another potential problem would lie in a user copying an app to a second computer with a different architecture.
Lots of reasons to distribute fat binaries if you’re not in the App Store (in which case the OS can/will auto-download the correct version upon migration).
I'm not a macOS user, but aren't macOS applications installed by just copying the bundle (which appears to the user as if it was a single file, instead of a directory)? Making the same application bundle work on both architectures means that a user who upgrades from an Intel macOS to an ARM macOS can just drag-and-drop the applications from the old system to the new system, and they will work.
Most applications are distributed that way. Some have more complicated, Windows-style installers (PKG files) because they need to do things like install kernel extensions or what have you before they'll work.
> It seems like the main advantage would be when there is (something like) a networked drive simultaneously accessed by incompatible machines
Your thinking is way too complex. Apple's #1 priority is smooth UX, and offering different versions of an app that require background knowledge about your computer to choose between is very much not smooth.
If you download a Windows application, e.g. https://www.textpad.com/download, you often have to decide whether to download the 32-bit or 64-bit version. Not a big problem for any of us HN users, for sure. But it's not a great interface for, e.g., my dad, who just bought a laptop that "runs Windows"; he doesn't know whether it's the 32-bit or 64-bit version.
Might not be so relevant today for Windows, as I guess it's been a while since 32-bit Windows was popular. But the point stands, that during times when there are multiple versions in popular use, as will soon be the case for the Mac platform, asking the user to choose which they have is confusing, as they might not know.
Being able to download a single binary and it "just works" is exactly the user-interface for a download page that you want.
For most applications the machine code is not the bulk of the size, that’s assets like images. These aren’t duplicated in a ‘fat’ binary so the size increase is rather modest.
The advantage is convenience for the user (and developers), one download that works transparently for all computers.
It's an enormous advantage in one specific case, where you have enough non-technical users who would otherwise need to manually choose a binary.
That was a big deal for Apple during their two previous transitions, but it matters less these days, when, as I understand it, a significant portion of Mac software distribution happens through the official App Store.
We see the same in the Linux world, where fat binary support has been developed and proposed on multiple occasions but has never really caught on because the number of users who don't know their CPU architecture but want to install software from outside their distro's package manager is effectively negligible.
Universal Binaries were barely an issue fifteen years ago, either. Sure, they were a bit larger but only the actual compiled code changed. Any assets used were the same and the difference in package size wasn't particularly significant.
Yes, having a user manually choose a binary to download from a web browser is poor, but certainly there are technological solutions to this that are lighter weight than doing a full download of a fat binary for everyone. Perhaps you download a shim installer that tests your processor and chooses the correct download? Perhaps instead of a generic web browser downloads are done within a framework that already knows the correct architecture? Especially if you are Apple, and if the goal is simply choosing the right version initially, it seems like there would be many better solutions.
> Yes, having a user manually choose a binary to download from a web browser is poor, but certainly there are technological solutions to this that are lighter weight than doing a full download of a fat binary for everyone.
In the average application, the binary is only a minor contributor to bundle size, assets are the vast majority of it.
For instance on my system Delicious Library 3 is a 110MB bundle. The binary is under 4MB. Same observation with Keka (30MB / 344K), xACT (15MB / 960K) or unicode checker (10MB / 500K)
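Putting numbers on "rather modest" — using the bundle/binary sizes listed above, here's roughly how much adding one more architecture (i.e. doubling just the binary) would grow each bundle:

```python
# Bundle size vs. binary size for the apps listed above (approximate).
# A second architecture roughly doubles the binary, not the assets,
# so the overall bundle growth is small.
apps = {
    "Delicious Library 3": (110 * 1024, 4 * 1024),  # (bundle KB, binary KB)
    "Keka":                (30 * 1024, 344),
    "xACT":                (15 * 1024, 960),
    "UnicodeChecker":      (10 * 1024, 500),
}

for name, (bundle_kb, binary_kb) in apps.items():
    growth = 100 * binary_kb / bundle_kb  # extra space from one more arch
    print(f"{name}: +{growth:.1f}%")
```

All of them come out in the single digits, which is the point: the assets dominate.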
Fat binaries were not an issue during the last transition, they're even less so now.
> Perhaps you download a shim installer that tests your processor and chooses the correct download?
Wouldn't that only work if the shim itself was a universal/fat binary? Otherwise you run into the same problem. You can sort of work around it on Unix-like systems by using a shebang, and on Windows by making the executable a .NET one.
Sure, it could be a universal binary, or something interpreted, or bytecode using an existing VM. But the shim would be something small, and could probably be reused as a standard installer for many programs. My argument wouldn't be that a universal binary is never the right approach, but that it seems wasteful to make every binary support multiple architectures instead of using a more targeted approach.
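A sketch of what such a shim's core logic could look like, assuming the shim itself can run on any architecture (Python here purely for illustration; the URLs are made up):

```python
import platform

# Hypothetical shim installer: detect the CPU architecture at runtime
# and pick the matching thin download. URLs are illustrative only.
DOWNLOADS = {
    "arm64": "https://example.com/myapp-arm64.dmg",
    "x86_64": "https://example.com/myapp-x86_64.dmg",
}

def choose_download(arch: str = "") -> str:
    arch = (arch or platform.machine()).lower()
    if arch in ("arm64", "aarch64"):
        return DOWNLOADS["arm64"]
    if arch in ("x86_64", "amd64"):
        return DOWNLOADS["x86_64"]
    raise RuntimeError(f"no build for architecture {arch!r}")

print(choose_download("arm64"))
# → https://example.com/myapp-arm64.dmg
```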
This kind of thing is pretty cool, but I’ve learned to avoid them, because they can result in A) BIG clumps of code, and B) testing issues.
I used to have a special shell script that I wrote, that used lipo to combine multiple binaries into an aggregate library. I probably still have it around, somewhere. I was fairly proud of it.
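For the curious, the container format lipo produces is tiny: a big-endian header (magic 0xCAFEBABE plus an arch count) followed by per-architecture entries giving CPU type, file offset, and size. A minimal parser sketch — constants are from my reading of Apple's mach-o/fat.h, and the "binary" here is just a hand-built toy header:

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian magic for a universal (fat) binary
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64", 0x12: "ppc"}

def list_architectures(data: bytes):
    """Return (name, offset, size) for each slice in a fat binary."""
    magic, nfat = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    slices = []
    for i in range(nfat):
        # Each fat_arch entry is 20 bytes: cputype, cpusubtype,
        # offset, size, align (the latter a power of two).
        cputype, _sub, offset, size, _align = struct.unpack_from(
            ">iiIII", data, 8 + i * 20)
        slices.append((CPU_NAMES.get(cputype, hex(cputype)), offset, size))
    return slices

# Build a toy two-architecture header to demonstrate (header only,
# not a runnable binary).
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">iiIII", 0x01000007, 3, 0x1000, 0x4000, 12)
header += struct.pack(">iiIII", 0x0100000C, 0, 0x8000, 0x4000, 14)
print(list_architectures(header))
# → [('x86_64', 4096, 16384), ('arm64', 32768, 16384)]
```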
Also, I had to use static libs, which wasn’t a problem, as long as I didn’t mind a big executable that couldn’t be trimmed.
Over the years, I’ve learned (the hard way, since I’m a stubborn knucklehead) not to “end run” around Apple’s prescribed methodology.
Now that Apple has wrapped Swift Package Manager into Xcode, it lessens the need for “clumped” binaries (for internal projects).
Ah yes, Java, with the promise of "write once, run anywhere"...as long as anywhere has a specific version of the JRE installed that's been EoL since 2007.
Obviously that's not the case for all Java software, but it has always amazed me how easy Java seemingly made it to end up like that. So many legacy Java apps are a complete pain to support, and $deity help you if one user needs to run two different legacy apps with non-overlapping JRE requirements.
And that's not even getting in to the horrors that were the Java plugin, fortunately that is entirely out of my life and gone for good.
Agree and I am very glad I don't have to deal with this type of thing any more.
Sane support for different JREs is one of the use cases where containerization would really show its worth, I expect. It should be relatively easy to set up a container with the app, the right JRE version, and a consistent set of dependency jars, all wrapped up with a shell script that sets the classpath, applies any horrible monkey patches you need to get things working, and then launches the app with the right set of arguments. Once that's done, all that nastiness is kept in its own private hell: you can run multiple of them simultaneously knowing they won't infect each other, and upgrading any one piece (and/or your system JDK) isn't going to kill everything.
It's not dependent on the system or user classpath or other environment variables, so should be reproducible and avoid weird "it works for me" bugs where the user is suffering something you just don't see.
Safe to say that's far from what "write once, run everywhere" seems to promise.
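The containerized setup described above could be sketched roughly like this (the base image tag, file names, and paths are all illustrative, not a tested recipe):

```dockerfile
# Hypothetical container pinning a legacy app to the exact JRE it needs.
# Swap the image tag for whatever ancient JRE the app actually requires.
FROM eclipse-temurin:8-jre

WORKDIR /opt/legacyapp
COPY legacyapp.jar lib/ ./

# All the classpath hacks and monkey patches live in this script,
# inside the container, instead of polluting the host environment.
COPY launch.sh /opt/legacyapp/launch.sh
ENTRYPOINT ["/opt/legacyapp/launch.sh"]
```

Two apps with non-overlapping JRE requirements then just become two images with different base tags.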
> as long as anywhere has a specific version of the JRE installed that's been EoL since 2007.
I just ran into this on a new machine while trying to get Bamboo (Atlassian's CI product) running. Just download latest? phhhttt. It needs a specific version: 1.8. Wait a minute, WTF? Java is at like v14 now, or something, right? Ah, because apparently there's some soft link from 1.8 to $WHATEVER_VERSION. Didn't matter, because I downloaded the JRE but needed the JDK. Or something. I eventually got it working, and wrote it down this time.
I go back so far that I've written software on punch cards. And someone suggests that we foist this on your average Mac (or for that matter, any OS) user? Fat binaries, please.
> It needs a specific version: 1.8. Wait a minute, WTF? Java is at like v14 now, or something, right?
There was a compatibility break between Java 1.8 and Java 9 (the version number also lost the prefix at the same time, Java 1.8 can be thought of as "Java 8"). It was not as big as the change from Python 2 to Python 3, but a lot of software needed changes to not break; as late as last year, I was still seeing Java libraries adding fixes for compatibility with Java 9. If your software depends on libraries which have not yet been upgraded to work with Java 9, you have to require Java 8 (that is, Java 1.8).
It doesn't help that, even today, the default version of Java in some major Linux distributions is Java 1.8, which is not only a good reason to keep compatibility with Java 8, but also means there's a good chance the server already has Java 1.8 installed. (In the not-so-distant future, however, these major Linux distributions will probably have Java 11 as the default; the next long-term stable Java version after 11 is probably going to be 17, so you'll see a lot of software require Java 11 for a while, and later Java 17.)
Thank you for a helpful explanation from a software engineer, but one who is not steeped in Java all day. It is annoyingly opaque to those of us for whom Java is just a tool.
I was serious, but of course immediately received negative feedback. Super Duper, it is then. Any chance someone will explain why they feel that is the superior choice? It doesn't seem to be any more precise. If anything it is just some unrelated expression used for kicks. Or was this just an emotional reaction unrelated to any particular line of reasoning?
I downvoted it because I thought it was a low effort joke, and I wanted the comments to focus on the concept rather than the name. Also, I thought you were suggesting "totally gross binary" as a general replacement for the established term "fat binary", and it didn't seem like a good idea. I treated Super Duper as just a sarcastic title for a blog post, and don't think it's actually any better. Apologies if I misunderstood your intent.
Respectfully, I don't think it added anything to the discussion regardless of whether or not you were serious. Why is a multi-arch binary "totally gross"? The advantages and disadvantages have already been discussed heavily upthread, and they did it in a much more substantive way.