Mozilla HTTP Observatory

(observatory.mozilla.org)

185 points | by chynkm 948 days ago

13 comments

  • marginalia_nu 948 days ago
    I have really mixed feelings about deprecating HTTP for HTTPS. There are a lot of websites that are never going to migrate, websites with quality content. There are also a lot of clients that are never going to support HTTPS. There's nothing wrong with them otherwise, the hardware is still good, but they can't be used anymore, and it's not the owners' choice to decide, but a few big tech companies pushing this change.

    Even if we do care about these nefarious men in the middle, the elephant in the room is that a large part of the encrypted traffic these days goes through major CDNs, and for them to actually cache pages and route requests and be anything more than a glorified NAT, they need to inspect what's being sent and keep track of who is sending it. Even if they totally pinky swear they aren't doing anything nefarious with their position of being able to inspect and analyze a huge portion of the Internet's traffic, and even if we believe them, that can change.

    Remember SourceForge? Remember when they were the heroes of open source, the go-to host for source code and binaries? Remember when they were bought up and subsequently were caught with their fingers in the cookie jar bundling malware in said open source packages?

    All I'm saying is that there sure are a lot of eggs in that basket. It's a lot easier to lean on one or two CDN operators than it is to lean on every ISP in the country.

    • xg15 948 days ago
      I think it depends on the actual threat model.

      It's a bit weird that HTTPS-everywhere was initiated in the aftermath of the Snowden leaks. If your enemy is the CIA or NSA etc, HTTPS probably won't help you: They can compromise your CDN or simply go for your hoster directly.

      However, there are many smaller and more common threats against which HTTPS provides significant defense that wasn't there before. In particular manipulative ISPs, public WiFis, or third parties who are logged into the same WiFi as you and are capturing traffic. (Remember Firesheep, etc.)

      That being said, I also believe there are some significant downsides to the HTTPS push (and also DoH, eSNI and friends) that browser vendors refuse to address. In particular:

      - It's a huge driver of "platformisation" of the web: To access an increasing amount of browser features, you now have to rely on third party services - at the very least, you need a domain. This is particularly problematic for hobbyist projects, local-only networks and home servers.

      - It normalizes a situation where network traffic is treated as a black box. It's a good thing if my ISP or the cafe I'm working in can't inspect my packets - it's highly problematic if my company's admin has no idea what is going on inside the network or if I can't even inspect traffic from my own devices. I'm seriously wondering who is protecting against what here. This seems less for protecting users against snoopers and more for protecting tech companies against their own users - which is really the opposite of privacy.

      • marginalia_nu 948 days ago
        > - It normalizes a situation where network traffic is treated as a black box. It's a good thing if my ISP or the cafe I'm working in can't inspect my packets - it's highly problematic if my company's admin has no idea what is going on inside the network or if I can't even inspect traffic from my own devices. I'm seriously wondering who is protecting against what here. This seems less for protecting users against snoopers and more for protecting tech companies against their own users - which is really the opposite of privacy.

        Yeah, this is... troublesome, especially how fond software developers seem to be of telemetry these days. For software you haven't compiled yourself, that's completely opaque data leaving your network.

        Especially in light of how incredibly difficult it is to disable that telemetry collection in Firefox, it's not a great look for Mozilla.

        I don't know if this is the authoritative guide du jour, but the steps usually go something like this: https://gist.github.com/gamunu/7fbee4e2318fdc080395a7f74cc34...

        I'm not sure this trust we're effectively being coerced to put in software and platforms is great from a privacy standpoint. But as you said, there has not been a lot of talk about this.

        • marginalia_nu 947 days ago
          I just keep thinking: if something like the BonziBuddy scandal had been pulled off today, how would anyone ever find out?

          Well, actually, we have Siri, Alexa, Cortana and Google Assistant now, and the pitch is eerily familiar in many ways.

    • thayne 948 days ago
      > It's a lot easier to lean on one or two CDN operators than it is to lean on every ISP in the country.

      First of all, that's ignoring how easy it is to MITM on a public wifi network.

      Secondly, if a CDN starts misbehaving, customers can switch to a different one. For many people in the US at least, if an ISP is misbehaving, there may not be any other option to switch to.

      • marginalia_nu 948 days ago
        It's a matter of scale. You can perform opportunistic small-scale MITM attacks on wifi. Maybe that is a concern for interactive websites. But for servers that only host static content, why do they need encryption?

        And how would you find out if a CDN was misbehaving, especially in the context of gag orders? And even if we did find out, it's out in the open that Facebook and Google are doing all this really invasive tracking, yet nobody seems to be leaving them in any hurry.

    • trustonfirstuse 948 days ago
      I would like to see browsers support self-signed certificates as first-class citizens.

      This would probably mean supporting both certificate observatories and trust-on-first-use, at the user's discretion.

    • chrisan 948 days ago
      > There are also a lot of clients that are never going to support HTTPS

      What kinds of clients don't support HTTPS?

      • marginalia_nu 948 days ago
        HTTPS as a standard has pretty wide support, but the versions of TLS that are increasingly mandatory have much more limited support in older software.

        There is also a growing problem with stale root certs that are not always possible to update.

    • 2Gkashmiri 948 days ago
      I do not understand the drama about 100% HTTPS adoption. If I am watching videos or reading comics or recipes or browsing old books and stuff, why the hell should I be concerned about "privacy" and "MITM" and "end to end"? I would personally be happy if my payments are secure enough that I pay and the seller acknowledges the payment was received. That's it. That's the whole "security" I want from the internet. Now, if there is someone who wants to impersonate my news website and supply me a falsified article about harmless kittens, maybe I should be concerned about why I am being targeted, rather than about being able to securely read my harmless-kitten news article.
      • PeterisP 948 days ago
        Random examples of attacks through a non-HTTPS site with kitten articles (assuming an attacker on a shared wifi doing MITM):

        1. inject javascript on the kitten site that, if you move to a different tab for five minutes, redirects the page to a spoofed version of the login page of your payment provider or email account (usable to recover other credentials), so that if you go back to that tab you might think that your session there just expired and "re-login".

        2. inject javascript on the kitten site that replaces any legitimate links to social media platforms with links to spoofed login pages for these platforms.

        3. inject javascript on the kitten site that adds an overlay over the kitten videos where clicking on it prompts you to download video_player_plugin_totally_legit.exe in order to watch the kitten videos.

        4. inject javascript that replaces any ads (or just adds them) with attacker-controlled ones; possibly for some targeted scams.

        5. inject javascript that runs cryptomining in the background. The attacker earns some pennies, but your computer temperature and battery life start to suck.

        6. perhaps they get to inject a javascript sandbox escape - those are rare and not actively exploited at the moment but if your browser isn't up to date for some reason, then sooner or later another exploitable vulnerability will appear again in the wild.

        In short, any http-only site is effectively potentially fully attacker-controlled for malware delivery, as much as websites can be used for malware delivery.

        • 2Gkashmiri 947 days ago
          Would uBlock Origin help in this case? Not talking about NoScript.
          • PeterisP 947 days ago
            As far as I understand, uBlock Origin is based on blocking specific things, so anything previously unseen (e.g. custom malware) would be left untouched by it.
        • a1369209993 948 days ago
          > Random examples of attacks through a non-HTTPS site with kitten articles

          None of those attacks work unless you whitelist the site to run javascript, and if you are dumb enough to do that, those are all attacks you should be worried about the site itself deploying even if it is on https.

      • daveoc64 948 days ago
        People often don't think about the transition from a "public" page to a "private" page.

        Take Hacker News for example. If you go to the home page without logging in, you'll see a list of posts. Does this need HTTPS?

        Using your logic, it wouldn't.

        But there's a login link in the top right corner of the page. If someone were to MITM the home page, they could make this login link go elsewhere, and harvest data.

        The same is true of any site which has links to login/signup/payments/contact forms etc.

        That describes virtually every site these days.

        It's much easier to have every part of the site on HTTPS, than to have to worry about this.

        Plus you avoid the possibility of malicious ads, scripts, and malware being injected into the page.

        • 2Gkashmiri 947 days ago
          Agreed for the most part. Does this MITM happen at the local network level, between the ISP and the device, or upstream? Genuinely interested in figuring this out.
  • weinzierl 948 days ago
    It is an excellent tool. For a bit more background I found [1] (from 2016) quite insightful.

    In addition here are a few notes that I collected using it, no criticism - just (hopefully) constructive feedback:

    - The SSH (H not L) Observatory part seems to have been broken for a long time (months at least). Not exactly sure what it was supposed to do anyway and how useful it would have been.

    - I find the nomenclature for the Compatibility Level a bit unfortunate. As far as I understand, the highly secure and practically useful configurations recommended by Mozilla and elsewhere all end up classified as Intermediate. The more desirable sounding Modern seems to be unachievable for any real world site. I'd love to see counterexamples if I'm wrong.

    - It seems not to be very actively maintained since its main (and original?) author April King left Mozilla. About half a year ago I filed an issue where the Observatory scan hung forever for certain sites [2], but apparently no one ever looked at it. (Maybe it is not an issue with the Observatory, but I think I wrote a decent report and hoped for some feedback.)

    [1] https://pokeinthe.io/2016/08/25/observatory-by-mozilla-a-new...

    [2] https://github.com/mozilla/http-observatory-website/issues/2...

    • ff317 948 days ago
      > The more desirable sounding Modern seems to be unachievable for any real world site. I'd love to see counterexamples if I'm wrong.

      I'd agree that the description of "M" in https://wiki.mozilla.org/Security/Server_Side_TLS#Recommende... is unrealistic for a site with a large and diverse audience, so far. The primary issue is that it requires turning off TLSv1.2.

      Wikipedia is a good example of "about as Modern as you can get in the real world with a big global audience". It's a little stricter than "Intermediate", but doesn't quite meet the "Modern" description. The key items there are that Wikipedia still supports TLSv1.2 (but only with TLSv1.3-like ciphers) and it still supports dual-cert compatibility (ECDSA+RSA). The RSA support is likely to be on the chopping block soon, as the only real use-case for RSA in this config is to support ancient installs of the last-known-good version of Chrome (49) on WinXP SP3, but Wikipedia will likely have to continue supporting TLSv1.2 for quite some time.

      In any case, though, Wikipedia still gets an "M" rating in the check, so either the description is wrong or the check is buggy:

      https://observatory.mozilla.org/analyze/en.wikipedia.org#tls
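
      For reference, an Intermediate-style nginx config is roughly this shape. This is only a sketch: the cipher list is abbreviated, and Mozilla's SSL Configuration Generator has the full current values.

        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305;
        ssl_prefer_server_ciphers off;
        # "Modern" would drop the TLSv1.2 line entirely, which is exactly
        # the part that cuts off the older clients discussed above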

      • tialaramex 948 days ago
        Today TLS 1.2 support is less worrying than it might have been so long as your TLS 1.3 implementation is competent.

        TLS 1.3's downgrade protection means that most downgrade attack avenues aren't viable. In earlier versions the downgrade protection was pretty flimsy, so it's more of a worry that a server offering TLS 1.0 through TLS 1.2 could be vulnerable to downgrade, after which a TLS 1.0-only attack works on clients that could have done TLS 1.2 but were downgraded. No halfway decent TLS 1.3 client will allow that: if you try to downgrade it, the client gives up instead. Denial of service was always possible, but downgrade needn't be.

        Most of this is actual design work in TLS 1.3, but a fair amount is simply willpower from major clients like web browsers: deciding that, OK, if you insist on downgrading the protocol, we would rather tell the user the site is inaccessible than allow that.

  • ajnin 948 days ago
    Got an F because I didn't implement XSS protection on my static non-interactive non-JS website.
    • chrismorgan 948 days ago
      Yeah, −60 for lacking Content-Security-Policy, X-Content-Type-Options, X-Frame-Options and X-XSS-Protection is grossly excessive. As is declaring XFO “mandatory for all new websites, and all existing websites are expected to add support for [it] as soon as possible” <https://infosec.mozilla.org/guidelines/web_security#x-frame-...>. XFO is entirely unnecessary for stateless and actionless websites, CSP of no value on static sites that load no third-party resources, and the rest of them of no value on static sites. You could say they protect against man-in-the-middle attacks or attacks on your static file server (nginx or equivalent), but any competent MITM will modify headers, and with attacks on the static file server you’re hosed anyway. I also think they should significantly downgrade at least XFO (they last touched that descriptive document I quoted over three years ago), because the browsers that want it are all now obsolete (IE11 being the only one that’s even vaguely still in use) and entirely unsupported by many systems.

      I get 40⁄100, D⁺, because of this stuff, and I have not the slightest intention of changing it because I’m stubborn and know that I don’t need them. Well, it’s better than the 0⁄100 it gave me at the start of 2019, or the 20⁄100 back in 2016.

      It needs some sort of profile system for the rankings, so that you can say “anything”, “static site”, “static site with no third-party resources”, that sort of thing, and in the lattermost case have it say “OK, then we’ll just suggest CSP, XCTO, XFO and XXP rather than screaming about them”.
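
      (For reference, the full set of headers being demanded amounts to something like the following in nginx. This is only a sketch, and per the above it's arguably pointless on a static site with no third-party resources.)

        # exact CSP depends on what the site actually loads; this assumes same-origin images/CSS only
        add_header Content-Security-Policy "default-src 'none'; img-src 'self'; style-src 'self'" always;
        add_header X-Content-Type-Options "nosniff" always;
        add_header X-Frame-Options "DENY" always;
        # Mozilla's guidance historically suggested "1; mode=block"; OWASP now recommends "0" (see elsewhere in this thread)
        add_header X-XSS-Protection "0" always;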

      • geofft 948 days ago
        I mostly agree with you, but I think the strongest argument in favor of using them anyway is that you might add state/actions/third-party resources to your website at some point, and adding those headers now is harmless and a way to make those changes safe.
      • adamrezich 948 days ago
        >I get 40⁄100

        wholly unrelated but I had no idea about the existence of U+2044 FRACTION SLASH!

    • IggleSniggle 948 days ago
      To be fair, you don’t need JS in the exploited website to do XSS.
      • agilob 948 days ago
        I moved my website from WordPress with 20 plugins, comments, Disqus and lazy-loaded images to static pages generated from markdown using Hugo. It's literally plain text: no JS, no tracking, no cookies at all. I got downgraded from B to D for not having CSP, XSS and X-Frame-Options protections. I don't even have forms or JS on my blog anymore. All content is loaded from 1 domain.
      • calcifer 948 days ago
        But they also said non-interactive, so I'm assuming no forms of any kind either.
        • schemescape 948 days ago
          I just confirmed this on my own static site which has no forms or any input whatsoever (and no JavaScript, cookies, or external resources either).

          But I guess I wouldn’t use this tool on such a trivial site anyway.

        • IggleSniggle 948 days ago
          True. Although…you could also potentially use reflected URLs or cookies to pull off some kind of XSS attack.
    • Avamander 947 days ago
      Some of those headers are deprecated anyway and people should be using Content Security Policy. It's silly to subtract points for not using a deprecated header.
    • metalliqaz 948 days ago
      Same here. I don't think this tool was designed for our use case.
  • m_eiman 948 days ago
    Ok, seems a bit "modern web" focused. My score:

    -25 for not defending against JavaScript attacks on my JavaScript-free domain

    -40 for not forcing HTTPS

    -35 more for not protecting against non-existent JavaScript being manipulated

    • KronisLV 948 days ago
      > -40 for not forcing HTTPS

      For most of the websites out there (that might want to accept user input, or ensure that the page content isn't manipulated or hijacked), that indeed would be a good recommendation.

      What would prevent someone from adding a few <script> tags here and there with ads on your site, or code to spy on your users? ISPs have historically already abused this, here's just one example, though you can look up plenty more through your search engine of choice: https://stackoverflow.com/questions/30505416/eliminate-isp-i...

      Personally, I really dislike that the web has come to this.
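
      For what it's worth, the redirect itself is cheap. A minimal nginx sketch, assuming the certificates are already set up on the HTTPS side:

        server {
            listen 80;
            server_name example.com;
            # send everything to the HTTPS version of the same host
            return 301 https://$host$request_uri;
        }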

      • m_eiman 948 days ago
        > What would prevent someone from adding a few <script> tags here and there with ads on your site, or code to spy on your users?

        Nothing, probably. In a sane country and legal system doing things like that would be illegal. But on the other hand forcing HTTPS means that some users will never be able to access it due to old browsers and/or hardware.

        More likely though is that I mess up the HTTPS certificates, either by mistake or inaction, and lock out everyone who doesn't dare click the correct sequence of "ignore warning" buttons.

        I've already managed to block access for normal users to several sites, several times, by running too old certbot versions, not integrating things properly and whatnot. It's a good thing I'll never use HSTS and HPKP, or I'll make permanent messes.
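
        One small mitigation for the certbot part of that: renewals can be rehearsed against Let's Encrypt's staging environment before they matter, e.g.:

          # exercises the full renewal flow against the staging CA without touching live certs
          certbot renew --dry-run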

        • Asmod4n 948 days ago
          There are ISPs all over the world who inject their own content in http sessions, mostly ads.
          • acdha 948 days ago
            My front-end JavaScript collection on a global site has recorded traces of just how widespread this is — ads, malware (I'm assuming compromised hardware at the ISP as it was things like an ISP in Brazil injecting ads for a .ru pharmaceutical site), and in the case of what appeared to be an Iranian college's student network, something which appeared to capture more information about the browser and page content after the DOM had fully loaded.
        • marcosdumay 948 days ago
          > I've already managed to block access for normal users to several sites, several times, by running too old certbot versions, not integrating things properly and whatnot.

          To be fair, I've done that on my personal site a few times too, and if HTTPS is broken, I consider the site broken (the same as being offline). It's time for fixing it, not for hacking around an HTTP version.

          That said, I never cared about HSTS and CSP. Those headers are a joke. The correct way to force HTTPS and certificate verification is on the client, not as a hint by the server. Yeah, I will put them there if I ever bother to customize those webserver settings; there just isn't much of a reason either way.

          • acdha 948 days ago
            > That said, I never cared about HSTS and CSP. Those headers are a joke. The correct way to force HTTPS and certificate verification is on the client, not as a hint by the server.

            I agree that HTTPS is more effective than CSP, but HSTS addresses the problem of doing that without breaking the web: if you don't have a way to upgrade someone typing www.google.com into the browser's location bar or following an old bookmark, you're leaving millions of people vulnerable to potential MITM attacks on the local network. HSTS allows sites to opt in after they verify everything is working, so most traffic is protected long before every antiquated enterprise IT site is updated.
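
            The opt-in itself is just a response header served over HTTPS. Roughly, in nginx (the max-age here is illustrative; common advice is to start with a short value and raise it once you're confident everything works):

              add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;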

        • penagwin 948 days ago
          I live in the US and have had Comcast inject into HTTP requests. I noticed because of a pop-up in CS:GO's menu (it loads HTML for their blog).
        • Aeolun 948 days ago
          > It's a good thing I'll never use HSTS

          Always fun when you lock yourself out of your own website for several days.

        • markandrewj 948 days ago
          What old clients exactly? HTTPS is a long supported standard. HTTPS was introduced by Netscape in 1994.
          • mook 948 days ago
            I don't think you'll be able to browse any HTTPS websites today using Netscape (probably any version, not just something from 1994) — Wikipedia lists the last one as based on Firefox 2.x, and caniuse reports that it doesn't support TLS 1.1 (the TLS 1.0 chart appears to have been removed for being too obsolete).
            • markandrewj 948 days ago
              Expecting the user to have a browser that supports a recent version of TLS is not unreasonable. My point was that even browsers older than most people imagine support HTTPS, but like you said, something newer than Netscape would be required. If the argument being made though is that we shouldn't require HTTPS in order to support users whose clients don't support HTTPS, my question is still the same: what users and what clients?
              • m_eiman 948 days ago
                Servers no longer support the old TLS versions, though. Or if they do, they get minus points on checks like this.
        • KronisLV 948 days ago
          > Nothing, probably. In a sane country and legal system doing things like that would be illegal.

          It should be, but that isn't always the case. Not only that, but even if it is technically illegal, it still might be done because of a lack of people who'll take the guilty parties to court over it. So, in reality, you cannot avoid viewing that as a well-founded risk.

          > But on the other hand forcing HTTPS means that some users will never be able to access it due to old browsers and/or hardware.

          In a similar argument about what "should" happen - Google shouldn't just abandon numerous Android devices out there, nor should any other vendor. There should be mechanisms in place to ensure that these devices continue to function for the decades to come.

          But since that's not the case, it's inevitable that you'll cut off a small portion of your potential userbase, same as with many sites simply not functioning because the developers made the choice to require JS. Of course, that is your choice, unless other concerns (like security) force your hand.

          > More likely though is that I mess up the HTTPS certificates, either by mistake or inaction, and lock out everyone who doesn't dare click the correct sequence of "ignore warning" buttons. I've already managed to block access for normal users to several sites, several times, by running too old certbot versions, not integrating things properly and whatnot. It's a good thing I'll never use HSTS and HPKP, or I'll make permanent messes.

          I run a couple of sites through a .dev domain and I do agree with what you're saying, since locking yourself out sooner or later is inevitable, but in my eyes I'd treat it like any other problem out there, much like messing up exposing the correct firewall ports: fix the problem, set up monitoring to be alerted of any problems in the future and move on.

          That's why having development/test/staging environments is really useful and in case you fear rate limits, Let's Encrypt also has a staging environment that you can use before switching over to prod: https://letsencrypt.org/docs/staging-environment/

          Not only that, but there are a few web servers here and there that attempt to improve the situation with ensuring SSL/TLS, like Traefik. Personally, however, I've found Caddy to be the most painless, since with it I don't need to mess around with integrating certbot with Apache/Nginx, but can instead just use it, since it works out of the box for the most part: https://caddyserver.com/

          Apart from that, you can always just expose a version without HTTPS on the server's ports locally, so that you can set up a tunnel through SSH and access it from your device in emergency situations (or just use a self signed certificate for the "private" version).
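
          To illustrate the "works out of the box" point, a complete Caddyfile for a static site is roughly this (domain and path are placeholders; Caddy obtains and renews the certificate automatically):

            example.com {
                root * /srv/www
                file_server
            }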

    • hannob 948 days ago
      On your javascript-free domain it would be a good idea to have a CSP policy with script-src 'none'.
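
      In nginx terms that would be roughly the following (a sketch; adjust the rest of the policy to whatever the site actually loads):

        add_header Content-Security-Policy "default-src 'self'; script-src 'none'" always;
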
      • Avamander 947 days ago
        This blocks some malicious add-ons, btw.
    • OJFord 948 days ago
      But that's fine, right, because the "score" doesn't actually matter? If those are the results, and you can go through them, understand them, and know that they don't apply, then it's OK?
  • kiryin 948 days ago
    I'm interested in the SSH test, but I use a non-standard port for SSH. I've been led to believe this is a common practice; is there an option hidden somewhere? From what I can tell it just automatically tries 22 and fails.
  • KronisLV 948 days ago
    This is pretty nice! Added CSP headers and fixed the cookie attributes on my personal site thanks to it, had forgotten about those.

    The CSP analysis section (and maybe some others) could use a bit of improvement. For example, currently you get the following output:

      Clickjacking protection, using frame-ancestors
    
    With the following popover text:

      The use of CSP's frame-ancestors directive offers fine-grained control over who can frame your site.
    
    And yet, nowhere does it recommend actionable steps. The page that you're probably looking for in that situation might as well be a clickable link: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Co...

    Thankfully, the recommendation section does have some useful links apart from that! Not step-by-step guides, but they would still lead you in the general direction of what you need.
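
    For anyone landing here looking for the actionable step, the directive in question ends up as a response header roughly like this (use 'self' instead of 'none' if the site needs to frame itself):

      Content-Security-Policy: frame-ancestors 'none'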

  • sciurus 948 days ago
    For anyone curious, you can find the source code at https://github.com/mozilla/http-observatory
  • offsky 948 days ago
    Here is a more comprehensive website testing tool.

    https://www.validbot.com

    Full disclosure, this is my project.

    • Seirdy 948 days ago
      Great tool. Some feedback:

      - The section on FLoC is a bit inaccurate; I wrote about what the interest-cohort permissions policy does and doesn't do:

      https://seirdy.one/2021/04/16/permissions-policy-floc-misinf...

      - Safari now supports normal icons for pinned tabs. apple.com no longer uses a mask-icon.

      - X-XSS-Protection should be set to 0 (disabled) according to OWASP's latest guidelines, since XSS filtering has been found to introduce new security vulnerabilities:

      https://owasp.org/www-project-secure-headers/#x-xss-protecti...

      - Some implementations of HSTS and auto HTTPS upgrades mandate that HTTP-to-HTTPS redirects don't change the hostname, including the www prefix. If anything, this tool should recommend against a single redirect for HTTP->HTTPS upgrades and www subdomain prefixing/removal.

      - Very, very few browsers support X-Frame-Options but lack support for CSP; even fewer have a modern TLS stack that works with secure cipher suites. X-Frame-Options should no longer be needed since the CSP header fills its use case.

      I'd recommend taking a look at some existing checkers for reference. Webbkoll, check-your-website.server-daten.de, Hardenize, Lighthouse, and Webhint.io are some good ones.

    • chrismorgan 948 days ago
      Tried it on my site: https://www.validbot.com/report/b6c2b0aec340f6133de16148a495...

      Some of the icon tests are bogus. I deliberately don’t put any <link rel=icon> on my site, but have a favicon.ico containing (among other sizes) 16×16 and 32×32. Your tool complains about meta tags for 16×16 and 32×32 not being found in the HTML. Well, they’re not, but they’re not necessary, because I haven’t put anything in place that would disrupt the favicon.ico fallback. 192×192: … why? Won’t things gladly scale that 512×512 you want down? Manifest and other large icon sizes: this stuff isn’t relevant to all sites.

      And that’s a problem with these sorts of tools in general, they give scores tuned to a single usage profile which simply isn’t suitable in all cases. As with HTTP Observatory’s XSS stuff commented about elsewhere in this thread. What we need for tools like this is profiles that twiddle rankings. Things like “personal content site” which changes manifest and Apple/Safari/large icons to optional. As it stands, the weighting of this extra stuff is way off base—I get given an F for that section, when I honestly think it should get at least an A, when operating under my hypothetical “personal content site” profile.

      Test 48 is bogus, the <body> start tag is optional.

      Test 111 wants initial-scale on the viewport meta tag. I’ve been casually searching for someone to confirm what it actually does and whether it’s still needed. Most indications suggest it was basically a workaround for an ancient iOS Safari rotation bug, but I’ve come across at least one person stating (without detail) that it still did something. Any chance you have Apple stuff and can investigate more as to whether it’s actually still at all useful?

      Test 33, DMARC record formatting, looks bogus.

      • offsky 948 days ago
        No general purpose testing tool like this can be a one-size-fits-all sort of thing. In the future I plan on adding configuration options so you can disable tests that you don't care about.

        If you know what you are doing, then by all means feel free to disregard any tests that you don't agree with. The suggestions that Validbot makes are meant to be general-purpose "best practices" to help web developers make sure they are paying attention to everything they should be. Sounds like you are, and have made some good decisions.

    • edoceo 948 days ago
      Why do you want to make TTLs on some things one day? You think an hour is too short?
      • offsky 948 days ago
        For some reason TTL recommendations seem to cause heated debates. In my opinion, it really depends on what sort of website you are making. A Google type website will need different TTLs (among other things) than a personal blog. The point really is to think about it and make a conscious decision instead of just accepting the defaults that your DNS provider uses. I think 1 hour is just fine.
  • hidalgopl 948 days ago
    Had an almost identical idea for a startup about a year ago. I was thinking about it as a SaaS, but then I figured out there was not enough interest for such a product.

    The idea was to run almost the same set of checks as the HTTP Observatory does, using a CLI I created: sailor. We decided to have it as a CLI for the sake of simplicity of integrating it into CI & CD.

    After I decided we wouldn't be trying to build a business around it, I removed the SaaS dependency and open-sourced it.

    You can check it here: https://github.com/hidalgopl/sailor

  • exciteabletom 948 days ago
    I was previously using https://ssllabs.com, but this is much more comprehensive! It even includes ssllabs as a third party test!
    • hannob 948 days ago
      Isn't it just very different from SSLLabs? Like SSLLabs is testing for TLS configuration and vulnerabilities, this is testing for HTTP security headers. There's some overlap (HSTS), but for the most part these are just two different tools doing different things.
      • input_sh 948 days ago
        Yup, this is more like securityheaders.com than ssllabs.
  • tootie 948 days ago
    Is CSP still recommended? I thought it was considered overkill for little benefit
    • rnicholus 948 days ago
      There is enormous benefit with a _strict_ CSP. It's unfortunately common to see a CSP that whitelists CDNs, allows eval, etc. These are arguably worse than not having a CSP at all due to the false sense of security. More details in this excellent writeup at https://storage.googleapis.com/pub-tools-public-publication-....
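
      The "strict" shape that writeup advocates is nonce- or hash-based rather than a host whitelist. Roughly, where {RANDOM} stands in for a fresh per-response nonce generated by the server:

        Content-Security-Policy: script-src 'nonce-{RANDOM}' 'strict-dynamic'; object-src 'none'; base-uri 'none'
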
    • doliveira 948 days ago
      A lot of it seems especially targeted towards websites with tons of third-party scripts
      • tootie 948 days ago
        Yeah, I work on some public-facing sites that have analytics and programmatic ads and the like. Our list of script and frame allows would be pretty long. And since the analytics team own and operate our tag manager, they can inject third-party scripts at will without needing a release, which makes maintaining CSP a whole job on its own.
      • bleuarff 948 days ago
        Isn't that the majority of the web today?
        • doliveira 947 days ago
          Indeed, but it's messed up that we had to invent this whole new standard just to keep including hundreds of analytics scripts