Your app shouldn't suffer SSL's problems (2011)

(moxie.org)

44 points | by tgragnato 2193 days ago

2 comments

  • tialaramex 2193 days ago
    So, this article is about six and a half years old now (December 2011).

    Distrust of the Web PKI (the Public Key Infrastructure for SSL/TLS on the Internet that basically falls out of what the Netscape Corporation built in the 1990s) was high and lots of people were sure better was possible. Moxie was one of them and so was I.

    So - what happened? Maybe somebody wrote a good article summarising it and will link it here, but otherwise here's a brief summary in two halves, plus a bonus.

    1. Web PKI trustworthiness improvements. The CA/B Forum Baseline Requirements have been tightened up somewhat, so that the accepted minimum standard is higher than it was. Google's Certificate Transparency work ensures we (the public) can see what CAs are up to, rather than relying on rumours and guesswork, and so we can insist they do what they've said they would before disaster strikes. The Mozilla project's mozilla.dev.security.policy list, open to the general public, began to take its oversight role more seriously.

    2. Alternatives proved trickier than anticipated. e.g. Pinning sounds great, and it works for Google, Apple, Microsoft. Most of the time. But it's a massive foot gun, and in some forms (HPKP) it's also a ransom mechanism; DANE leverages DNSSEC, but when you try to do that you find that a surprising number of users don't have working DNSSEC, indeed their DNS doesn't work properly even without DNSSEC; TOFU approaches which worked great for SSH in practice turn out not to suit the way ordinary users interact with the Web.

    Bonus: Some fraction of the people who expressed concerns like Moxie turned out to mainly, perhaps unconsciously, not want to pay $$$. Let's Encrypt almost incidentally (because it's automated and machines don't have wallets) is free at the point of use. It's surprising how many people who were absolutely certain they hated the Web PKI in principle came around to it once it was free...

    • tptacek 2193 days ago
      You're talking about the Web PKI. Moxie is not talking about the Web PKI; he's pointing out that mobile apps don't need to rely on the Web PKI, because installing an app already involves deploying a trust anchor (the app itself), and you might as well exploit that to bootstrap trust with your servers rather than using CA signatures.
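What this looks like in practice is certificate (or key) pinning inside the app. A minimal sketch in Python (the host name and pinned fingerprint are illustrative; a real app would more likely pin the SPKI hash rather than the whole-certificate hash, so the server can re-issue without an app update):

```python
import hashlib
import socket
import ssl

# Hypothetical pin: the SHA-256 hash of the server certificate's DER
# encoding, shipped inside the app at build time. Value is illustrative.
PINNED_SHA256 = "d4c9d9027326271a89ce51fcaf328ed673f17be33469ff979e8ab8dd501e664f"

def matches_pin(cert_der: bytes, pinned_hex: str) -> bool:
    """True iff the certificate's SHA-256 fingerprint equals the pin."""
    return hashlib.sha256(cert_der).hexdigest() == pinned_hex

def connect_pinned(host: str, port: int = 443) -> bool:
    """TLS-connect and accept the peer only if its cert matches the pin.

    CA chain validation is deliberately skipped: the pin shipped with
    the app, not a CA signature, is the trust anchor here.
    """
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # no CA validation; we pin instead
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
            return matches_pin(der, PINNED_SHA256)
```

The trade is exactly the one debated below: the app vendor controls both ends and the update channel, so key rotation is an app update rather than a CA ceremony.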
      • tialaramex 2193 days ago
        "You might as well" implement your own secure transport protocol instead of SSL/TLS by the same thinking. With your own protocol implemented in your app there's no need to rely on whatever TLS implementation ships with the device.

        In both cases it's a trade, and my argument is that just as it would have been a mistake to roll your own transport security protocol in 2011, it's a mistake to make your own PKI in 2018. At least for any reason other than education.

        There doubtless were people in 2011 who could have come up with something better than TLS 1.2, and there doubtless are people in 2018 who can do a better job of managing a PKI than we've done of the Web PKI. But most people's secure transport protocol in 2011 would have made SSL 2.0 look good, and most people's attempt at running their own CA today is going to be OpenSSL on somebody's laptop.

        Part of the reason for my skepticism is interacting with private PKIs where they touch the edge of the Web PKI. Without exception they are badly managed. Whether it is the Payment Card Industry, the US Federal Government, or a company-specific setup that I end up debugging, it's pretty shocking. If Bad Guys aren't breaking these systems, it's because they aren't trying. And if they aren't trying, who cares what you do about security - anything would work.

        • tptacek 2193 days ago
          Most software developers, tasked to implement their own secure transport protocol, would fail to produce something as secure as even TLS 1.2, which is a flawed transport. And, of course, designing and implementing a secure transport is vastly harder than simply whitelisting certificates. So there's that, and there's the fact that HTTPS makes it through middleboxes that custom protocols don't. No, I don't think this is a meaningful comparison.
      • agwa 2193 days ago
        That's Option 1. In Option 2, Moxie suggests using the Web PKI, with pinning. I agree with tialaramex that this advice has not aged well. Most Web PKI experts would now advise against pinning and point to Certificate Transparency instead. Even Google, the pioneers of pinning that Moxie cites in his post, are moving away from pinning Google properties in Chrome.
        • tptacek 2193 days ago
          You're still talking about browsers. Moxie isn't: he's talking about mobile applications. There is little reason to rely on CAs, even with pinning, in a mobile application. You can simply make your own arrangements with your servers, since you control both ends of the connection. In that scenario, CAs add literally no value at all.
          • shawkinaw 2193 days ago
            Except that, IMO, I should be able to MITM the connection between an app on my phone and any server it’s talking to, for auditing purposes. In that case, CA/system trust-based systems let me do that relatively easily with a proxy. If the app sets up its secure connection with its own, uncustomizable trust store, I can’t.
            • throwaway2048 2193 days ago
              I doubt many app makers (or users for that matter) are interested in allowing that.
            • slrz 2193 days ago
              I don't think that loading up a debug build with your dev certificates built in is too much of a hassle for enabling MITM hooks.
              • Sophira 2192 days ago
                I believe the parent comment is talking about auditing apps as a user, not as a developer, in which case debug builds will not be available.
            • tptacek 2193 days ago
              Appsec firms routinely test iOS applications with pinned certificates.
    • bogomipz 2193 days ago
      >"in some forms (HPKP) it's also a ransom mechanism ..."

      Is this ransom as in "ransomware"? Can you explain how pinning and HPKP are a ransom mechanism?

      >"TOFU approaches which worked great for SSH in practice turn out not to suit the way ordinary users interact with the Web."

      How are they different exactly?

      • tialaramex 2193 days ago
        >"Can you explain how pinning and HPKP are a ransom mechanism?"

        HPKP exists today, but probably your site doesn't use it. Bad guys break in to your servers on Monday morning before work. They leave the site working exactly as it was before, except for two things: they replace the private key value used for the site, and they add HPKP headers requiring the corresponding public key (plus optionally others of their choosing). Nobody notices. Why would they?

        By Friday most of your big users have accessed the site; unknown to you or them, their web browsers are now irrevocably committed by HPKP to using one of the Bad Guys' chosen keys when talking to your site.

        On Friday evening, the Bad Guys remove their new private key, your site mysteriously goes down for most users. You have no idea why. You receive a telephone call from someone with a disguised voice. Transfer $1M worth of Bitcoins to the address listed on a hidden page on your web site, and access will be restored. Or don't, your choice, Bye.

        The dependency on this key has to be irrevocable, otherwise it can't achieve its security goals; that's the whole idea of pinning. But because it's irrevocable, you're screwed: either you pay the ransom or your site is "bricked" indefinitely.
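For concreteness, the commitment the Bad Guys create is a single response header they control once they're on the server (pin values here are illustrative placeholders; wrapped for readability):

```
Public-Key-Pins: pin-sha256="cUPcTAZWKaASuYWhhneDttWpY3oBAkE3h2+soZS7sWs=";
                 pin-sha256="M8HztCzM3elUxkcjR2S5P4hhyBNf6lHkmjAHKhpGPWE=";
                 max-age=31536000; includeSubDomains
```

Each pin-sha256 value is a base64 SHA-256 hash of a public key (SPKI); the spec requires at least two, one of them a backup pin. max-age says how long, in seconds, browsers must keep enforcing the pin after their last visit - a year, in this example.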

        >"How are they different exactly?"

        SSH users tend to visit a small number of systems, they achieve "First use" early, from a presumed friendly system and they usually have out-of-band contact (e.g. walk over to Dave and say "Hey! Dave, I saw this error! Did you change the keys?") when things go wrong. In contrast web users often first visit a site from some unfriendly environment (e.g. a coffee shop) and the only contact details they usually have for the site operator are on the site itself.
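The SSH model described above can be sketched in a few lines (a toy in-memory version; real clients persist the store, as ssh does with ~/.ssh/known_hosts, and the host names and fingerprints here are illustrative):

```python
# Toy TOFU (trust-on-first-use) store, the ssh known_hosts model:
# remember a host's key fingerprint the first time it is seen, and
# flag any later change for the user to resolve out of band.

def tofu_check(store: dict, host: str, fingerprint: str) -> str:
    """Return 'first-use', 'ok', or 'CHANGED' for a host's fingerprint."""
    known = store.get(host)
    if known is None:
        store[host] = fingerprint  # first contact: trust and record
        return "first-use"
    return "ok" if known == fingerprint else "CHANGED"
```

The whole scheme leans on that "CHANGED" case being rare and resolvable by walking over to Dave; for a web user in a coffee shop, neither holds.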

        • tptacek 2193 days ago
          That, again, is a browser problem. Mobile applications do not share it, since literally every user of the application is on an update channel controlled by the vendor.
      • tzahola 2193 days ago
        The issue with HPKP is that revocation is not possible if you don’t control the key that was pinned for your users. E.g. if an admin accidentally wipes the keystore, there’s no way of revoking the pinning info from the users’ browser. Taking this one step further, if an adversary enables HPKP on your behalf with a key that you don’t control, you’ll be at his mercy to keep your site operational.
        • bogomipz 2192 days ago
          Interesting unintended consequences. Thanks for the detailed explanations. Cheers.
      • steventhedev 2193 days ago
        Ransom in the sense that if you can hijack the DNS resolution for a website and set HPKP, you can then ransom the keys back to the legitimate owner. Or just destroy them and enjoy the damage. Imagine a domain squatter who grabs a domain and sells it off, but sets HPKP with a really long expiration first. Pinning needs to be strict to provide any security benefit, but that same strictness inherently allows it to be weaponized to deny legitimate access. UX and policy will need to become more nuanced to mitigate the potential damage.
  • gcb0 2193 days ago
    this ignores user trust.

    now the closed-source application is impervious to MITM attacks or bad CA operators, but the user also has no way to validate the traffic coming out of their very own device.

    generic is good sometimes. and the generic solution already accounts for generic user/app local certificates just fine. no need to hide an extra one that can't be verified/updated inside the app. that's kind of an asshole move.

    • tptacek 2193 days ago
      This doesn't make sense. Do you have the private key for Facebook's Digicert-signed certificate? No, you don't. So what difference does it make whether you get certificate trust from Digicert or from an anchor installed along with your mobile app?
      • saurik 2193 days ago
        ... because you essentially are always allowed to install your own custom root certificate when people use the system SSL support, even on iOS? (FWIW, I also feel like the implied solution here is wrong, but your analysis is worse.)
        • tptacek 2192 days ago
          What difference does that make? It's also annoying to test applications that don't honor system proxy settings. And?