> “To be clear, this is not a vulnerability or bug in Apple’s code... basically just unclear/confusing documentation that led to people using their API incorrectly,” Wardle told Ars. “Apple updated [its] documents to be more clear, and third-party developers just have to invoke the API with a more comprehensive flag (that was always available).”
Ultimately you have to opt in to doing any checks in the first place, no matter the API. So does that make every API insecure, since you could always just "return true" at the bottom of your authentication function?
To put it differently: Who's to say whether they were using the checks wrong, or just doing the wrong checks?
The API provides an easy way to check a binary for a signature: you just open the bundle, grab the binary, and pass it to the API. Oh, and if there's more than one binary in the bundle (there almost never is these days), then you should make sure to check each binary.
"Almost never is," except when an attacker knows you're counting on that fact and poisons a good bundle with a bad binary that your code skips over but that macOS happily executes.
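A toy sketch of that failure mode (plain Python; the slice list and `signed` flag are made-up stand-ins for illustration, not real Mach-O structures or Apple's actual API):

```python
# Toy model of the flaw: a "fat binary" is just a list of slices,
# each with an architecture name and a hypothetical signature flag.

def naive_check(slices):
    """Flawed: validates only the first slice, like the broken tools."""
    return slices[0]["signed"]

def thorough_check(slices):
    """Correct: every slice must carry a valid signature."""
    return all(s["signed"] for s in slices)

# A poisoned bundle: a legitimately signed slice up front, with an
# unsigned slice that the loader would actually pick and execute.
poisoned = [
    {"arch": "i386",   "signed": True},
    {"arch": "x86_64", "signed": False},
]

print(naive_check(poisoned))     # True  (the checker is fooled)
print(thorough_check(poisoned))  # False (the attack is caught)
```

The whole class of bug boils down to the gap between those two functions: validating "the binary" versus validating every slice the OS might choose to run.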
Actually it's not THAT uncommon for fat binaries to contain multiple architectures. True, on today's macOS (and especially this fall, when 32-bit support is deprecated in 10.14), it looks like x86_64 rules supreme, but for a long while it was common to have combined i386 and x86_64 fat binaries, and before that, combined ppc and i386 binaries. Finally, it's not far-fetched to believe that in the near future we'll need fat binaries with x86_64 and aarch64 slices.
Edit: actually, even in today's mostly-x86_64-only world, there are fat binaries in macOS, because there is a separate "x86_64h" architecture for "Haswell and better". So even in a pure 64-bit Intel world, there are going to be fat binaries for a while. For example, "file /usr/lib/libobjc.dylib" shows three slices on macOS 10.13.
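For the curious, the fat container format itself is tiny: a big-endian header plus one `fat_arch` record per slice, as described in `<mach-o/fat.h>`. A minimal sketch of a slice lister in Python (it only handles the classic 32-bit `FAT_MAGIC` layout, not the newer `fat_arch_64` variant, and the `CPU_NAMES` table covers just a few common cputype values):

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian universal-binary magic

# A few common cputype values from <mach/machine.h>
CPU_NAMES = {
    0x00000007: "i386",
    0x01000007: "x86_64",
    0x0000000C: "arm",
    0x0100000C: "arm64",
}

def list_slices(data: bytes):
    """Return the architecture name of each slice in a fat binary.

    Parses the big-endian fat_header and fat_arch structures from
    <mach-o/fat.h>; raises ValueError for non-fat input.
    """
    magic, nfat = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a (big-endian, 32-bit) fat binary")
    slices = []
    for i in range(nfat):
        # fat_arch: cputype, cpusubtype, offset, size, align
        cputype, _, _, _, _ = struct.unpack_from(">IIIII", data, 8 + i * 20)
        slices.append(CPU_NAMES.get(cputype, hex(cputype)))
    return slices
```

Note that each slice is an independent, complete Mach-O file at its own offset, which is exactly why a checker that only inspects one slice can disagree with the loader about what would actually run.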
On iOS, a non-fat binary is almost the exception rather than the rule. For the longest time it was common to have both armv6 and armv7 slices, and these days armv7 and aarch64 slices. Granted, with iOS 11 dropping armv7 and apps starting to drop iOS 10 support, we'll have a run of non-fat aarch64 binaries for a while. This is quite visible in compile times and compile errors during development! Also, for iOS there's bitcode and app thinning, which means end-user devices are often served a single-slice non-fat binary anyway.
Vendors of closed-source iOS libraries, such as the "Google Maps for iOS" SDK, often ship fat binaries for the .dylibs containing some mix of armv7, aarch64, i386, and x86_64 slices. Why are Intel slices a thing for iOS? To be able to run your app and the library in the Xcode iOS simulator, which actually runs x86 code only. That's why it's not called an "emulator".
The history of fat binaries in macOS goes all the way back to NeXTSTEP (of course, since macOS is basically a modern NeXTSTEP, with NSObject still showing off the legacy behind the curtain to new iOS developers) where even m68k was a common slice. https://en.wikipedia.org/wiki/Fat_binary#NeXTSTEP_Multi-Arch... which at times even exploded to "Quad-fat binaries" containing slices for m68k, i386, pa-risc and sparc all together in one executable.
macOS will execute anything you ask it to. The normal ways of running programs (like double-clicking on them in Finder) cause a (presumably non-buggy) code signature check to run on them, but there are certainly ways of executing programs that bypass this user-facing warning (like running ./Bundle.app/Contents/MacOS/Bundle in a terminal).
The bug described in the article says that some third-party code signature validation methods were flawed and didn't properly detect unsigned code that the third-party programs would then execute.
If the application code is responsible for checking the signatures, what stops an attacker from just shipping an older, vulnerable application together with their bad binary? I would dare to say that macOS should do the validation before executing any binary.
The bug isn't in macOS's signature validation, though, which works fine. The bug lies in the additional validation performed by certain third-party security products; it has nothing to do with any of Apple's code. Although, as another commenter has pointed out, Apple could maybe have designed the API better to avoid it being used wrongly.
The headline is very misleading. The article just describes a bug which happened to be present in several different third-party security apps, not in every third-party app. The article also notes that no tool built into macOS ever had the issue.