Adblockers Performance Study

(whotracks.me)

212 points | by kkm 1896 days ago

28 comments

  • magicalhippo 1896 days ago
    FWIW, Chromium devs have just responded[1] to the massive amount of feedback they got on the mailing list, like [2] and [3].

    One of the main pain points raised was the lack of any way to dynamically add rules, as well as the low maximum number of rules allowed (30k). Seems they've decided to support dynamic rule addition, as well as increasing the number of rules, though probably not by orders of magnitude by the sound of it.
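    For anyone who hasn't read the proposal, here is a rough sketch of what a declarative block rule plus dynamic rule registration look like under the proposed declarativeNetRequest API (the exact shape in the draft may still change; the filter and resource types below are made up):

      // Sketch only: a static block rule that the browser matches by itself.
      const exampleRule = {
        id: 1,
        priority: 1,
        action: { type: 'block' },
        condition: {
          urlFilter: '||ads.example.com^',      // ABP-style filter syntax
          resourceTypes: ['script', 'image']
        }
      };

      // Dynamic rule addition -- the capability the Chromium response now
      // promises to support. Dynamic rules still count against the rule limit.
      chrome.declarativeNetRequest.updateDynamicRules(
        { addRules: [exampleRule], removeRuleIds: [] },
        () => console.log('rule registered')
      );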

    Proof is in the pudding though.

    [1]: https://groups.google.com/a/chromium.org/forum/#!topic/chrom...

    [2]: https://groups.google.com/a/chromium.org/forum/#!topic/chrom...

    [3]: https://groups.google.com/a/chromium.org/forum/#!topic/chrom...

    • pythux 1896 days ago
      Hey, disclaimer: I worked on this study. Thank you for your comment.

      To me this reaction from the Chromium devs is missing one of the most fundamental issues. I'm not against the declarative API primarily because of technical limitations; I am against it because it is a strong innovation lock. The current extension ecosystem is flexible enough to allow hundreds (maybe thousands) of people to actively work on privacy-enhancing extensions (ad-blockers, anti-tracking, etc.), and the technologies, heuristics, and solutions to protect users' privacy on the Web are constantly evolving. The APIs are not used today the same way they were used 2 years ago.

      If Chrome decides to "freeze" the blocking capabilities of the browser into a declarative API that no one but Chrome devs can improve, they will be preventing people from finding new solutions to tracking and advertising (at least from extensions). It does not matter if they replicate 100% of the capabilities of today's ad-blockers; as long as it does not allow evolution and adaptation, it will become obsolete. There is precedent in this matter: Safari also has a similar API and it has been a huge pain for ad-blocker developers. The reason is simple: Apple and Google do not have the same strong incentives that we have to continuously improve the blocking capabilities of the user agent.

      My fear is that this declarative API will be an OK replacement for today's content blockers, but will not allow the same kind of fast-paced development we benefit from today in the space of privacy extensions.

      • gregknicholson 1895 days ago
        I'm hoping that Firefox will once again be seen as a viable alternative that “power users” would consider switching to if Chrome falls behind.
        • magicalhippo 1895 days ago
          I ditched Chrome 8 years ago, first for Opera and then Firefox once Opera became Chrome. It still has memory issues[1], but overall I'm very happy with it.

            [1]: it's probably not plain old memory leaks anymore, but since it uses a few long-lived content processes, pages/scripts that leak are an issue. Usually not a huge deal though; once one of the processes starts using 2-3 GB I just kill it and refresh the affected tabs (coughSlackcough).

        • simon_o 1894 days ago
          Would love to switch back to Firefox, but the general UX is just too terrible.

          I also lack the trust that even if they fixed the major issues (or even allowed fixing them yourself) I would be able to rely on things working in the mid-to-long-term.

          Currently holed up on Vivaldi, where the things you expect out of the box are, in fact, working out of the box (vertical tabs, mouse gestures, ...).

      • magicalhippo 1895 days ago
        I tried to make my comment rather unbiased, but yeah, it's still a step in the wrong direction IMHO.
      • saagarjha 1895 days ago
        Speaking as a user (and early developer) of Safari's content blockers, I have almost never run into an issue with them. What kind of development do you fear will be stifled by Apple and Google not having incentives to improve the blocking (which I find somewhat strange in the former case, anyway)?
        • pythux 1895 days ago
          What I'm afraid of is the following:

          * The blocking engine operated by either Safari or Chrome is a black-box and independent devs will have a harder time understanding it, tweaking it, improving it, debugging it.

          * Chrome devs are now playing nicely, gathering feedback, and proposing some improvements to the APIs, but there is no guarantee this will happen again, or that they will invest time/energy in the future improving this part of the browser.

          * It's harder to work with this API than a JavaScript code-base you control.

          * Chrome seems a bit better here but for Safari the documentation is pretty poor.

          * You also don't get feedback regarding the rules which matched on a page and this makes it harder to debug or give nice insights to users.

          That's only a few points from my personal experience but I discussed multiple times with developers of other privacy-enhancing extensions/apps and we shared similar feelings.

          • zaro 1895 days ago
            > Chrome devs are now playing nicely, gathering feedback, and proposing some improvements to the APIs, but there is no guarantee this will happen again, or that they will invest time/energy in the future improving this part of the browser.

            I think this is especially true. It is somewhat similar to many other Google products like Maps and Translate. They start as a good free product, but as soon as they gain enough traction the rules change. I think once this declarative API is the standard for ad blockers in browsers, Google will start exercising its control over it for its own benefit.

            • takeda 1895 days ago
              This is their long game. To me, all the pushing Google did for HTTPS, certificate pinning, etc. makes much more sense now. I was wondering why they were pushing it so hard.

              I mean, after they essentially blocked ways to use a proxy to filter content, the next logical step is to restrict the API.

              • jefftk 1895 days ago
                If you want to proxy your HTTPS traffic you add a local CA, and Chrome does not apply certificate pinning. Pinning is only for certs that chain back to the default CAs, specifically so people who need to proxy can do so.

                (Disclosure: I work for Google, though not on Chrome)

                • takeda 1893 days ago
                  Sure, but then you're still at the mercy of the browser.

                  The API change is totally unnecessary, yet is happening despite many protests.

                  The stated concern was that this is a performance and privacy issue, which looks like total BS (even according to the link we are discussing).

                  Extensions are installed by the user, so why not let them decide what to do with their browser? If it's really a concern, I don't think anyone would object if Google educated users about which APIs a given extension uses.

          • saagarjha 1895 days ago
            > The blocking engine operated by either Safari or Chrome is a black-box and independent devs will have a harder time understanding it, tweaking it, improving it, debugging it.

            I mean, both engines are open source, but yeah, I do agree that it would be nice to have this enshrined in a web standard rather than a de facto one driven by the whims of two large corporations.

            > You also don't get feedback regarding the rules which matched on a page and this makes it harder to debug or give nice insights to users.

            This seems like an easy tooling problem to fix.

        • takeda 1895 days ago
          The sites that display ads want to make sure ad blockers can't block them.

          Currently both sides adapt; if the way blocking works is locked down, ad blockers will quickly become obsolete.

          For example, a while ago most sites were creating popups with ads. After it got bad, browsers started blocking popups, initially by only displaying them when the user actively clicked. So sites started opening a popup when the user made their first click anywhere on the page; eventually browsers started blocking all popups and just notifying the user that a popup was triggered, giving them the choice of whether to see it.

          This solved the old popups, but then a new kind of popup was created that uses CSS to render within the browser window, covering the text. In addition, CSS layers were used to implement other attention-grabbing mechanisms, like ads that stay in place even when you scroll, or ads that suddenly appear between paragraphs, etc.

          This was targeted by ad blockers, which constantly adapt. Most ads are served from different domains, since the ad content is typically provided by a different company, but increasingly we see ads being served from the same domain as the rest of the website, or the website randomizing CSS component IDs, etc.

          What the Chromium authors are doing instead is providing an API for ad blockers to list their rules and letting the browser do the blocking. Supposedly that's to improve performance. The problem is that it will essentially freeze ad blockers in place: they will no longer be able to adapt, and eventually new kinds of ads will show up that ad blockers won't be able to block. And with other changes that Google successfully pushed, such as HTTPS everywhere, HTTP/2, and HTTP/3, it is now nearly impossible to block ads through a proxy.

          This is why I personally prefer to stick with Firefox.

          • roblabla 1895 days ago
            > And with other changes that Google successfully pushed, such as HTTPS everywhere, HTTP/2, and HTTP/3, it is now nearly impossible to block ads through a proxy.

            I agree with everything except this. First, HTTP/2 and HTTP/3 absolutely do not prevent blocking. If blocking proxies don't support them, then they're the ones lagging behind.

            Secondly, most blocking software working on the network layer uses DNS, which still works just fine and will likely continue to work forever.

            Thirdly, you can still, for the most part, MITM HTTPS connections on devices you own. You just need to install your own root SSL certificate. The only thing that prevents this from working would be HSTS preloading.

            EDIT: Actually, adding your own root cert bypasses HSTS preloading.

            • sm4rk0 1895 days ago
              > most blocking software working on the network layer uses DNS, which still works just fine and will likely continue to work forever.

              I wouldn't be so sure. Have you heard about DNS over HTTPS? I'm using XPrivacy on Android and have noticed that applications that use Android System WebView (based on Chrome) started making requests to 8.8.8.8, 1.1.1.1 and other public DNS services. It's still possible to block domains via the hosts file, but I bet it's only a matter of time before Google decides it's "in our interest" to start using their DNS instead of the ISP's.

            • takeda 1893 days ago
              Both HTTP/2 and HTTP/3 made TLS mandatory.

              The DNS part was already handled by someone else.

              As for MITMing HTTPS, you are at the mercy of the browser; they are deprecating the API, and they might restrict this as well if it becomes the way to do filtering.

        • the8472 1895 days ago
          The difference is categorical. The current web request APIs can delay approval/rejection of any request for an indefinite amount of time to perform arbitrary computation (including IO or talking to other extensions) to decide. It can also be stateful.

          This doesn't just knock the power of your adblocker down from Type-0 to Type-4 of the Chomsky hierarchy, it also limits the inputs it can act on.

          As an example, if I wanted an adblocker that looks at the DOM or JavaScript state of a page before allowing a request to load an iframe, this would have to happen asynchronously, since it means communicating with the page context. You can't do this in a declarative style.

          Or if one wanted to implement a "click to play" style tool for iframes one would have to hold the request indefinitely until the user approves. This probably isn't a good idea for technical reasons, but at least it is a possibility with current APIs.
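
          To make the contrast concrete, here is a minimal sketch of that kind of asynchronous decision using the Firefox flavour of webRequest, where a blocking listener may return a Promise (Chrome's current variant requires a synchronous return, but the point about arbitrary computation stands). The message name and logic are made up:

            // Background script (sketch): ask a content script about the page's
            // DOM/JS state before letting an iframe request through.
            browser.webRequest.onBeforeRequest.addListener(
              async (details) => {
                if (details.type !== 'sub_frame') return {};
                // Arbitrary, stateful, asynchronous work before deciding:
                const verdict = await browser.tabs.sendMessage(details.tabId, {
                  kind: 'should-load-frame',  // hypothetical message a content script answers
                  url: details.url
                });
                return { cancel: verdict !== 'allow' };
              },
              { urls: ['<all_urls>'] },
              ['blocking']
            );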

          • saagarjha 1895 days ago
            > The current web request APIs can delay approval/rejection of any request for an indefinite amount of time to perform arbitrary computation (including IO or talking to other extensions) to decide.

            I believe this is the reason Safari introduced content blockers. It fits in very well with the traditional computing model on iOS of preventing unbounded, arbitrary computing where possible.

            • the8472 1895 days ago
              I don't see how that's a good thing. If latency is a concern you can always inform the user what is causing slowdowns and let them decide whether the functionality is worth the cost or not, instead of taking the choice out of their hands.

              And even if we suppose that it is a reasonable policy for a moment, it's still not all that relevant, since we're not talking about the Apple ecosystem here in the first place.

        • tyingq 1895 days ago
          Safari's content blockers supported adding rules dynamically from the start.

          Google tried to roll this out initially without that obvious must-have. It speaks to intent, and probably to the future prospects of the API.

          And, of course, a declarative API with pattern matching limits what you can do anyway. No heuristics, no behavior-based blocking, etc. You are pretty much stuck with cataloging the patterns of a ton of websites as your only approach.

      • kkm 1895 days ago
        In addition to this, let's also keep in mind the huge cost of maintaining two different code bases of extensions for different browser versions.
    • danShumway 1895 days ago
      Dynamic rule addition addresses a very small number of complaints with the proposal.

      Beyond the ability to block requests conditionally based on arbitrary logic (not just a few pre-decided qualifications like request size), one point I want to keep coming back to is that there are actually legitimate reasons why an extension might choose to slow down requests. I use Tampermonkey scripts on a couple of social networking sites deliberately to slow them down so that I'll be less likely to impulsively refresh them.
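
      For the curious, here is a minimal sketch of that kind of userscript (the matched site and the delay are made up); it wraps window.fetch so the feed's requests resolve slowly:

        // ==UserScript==
        // @name        Slow down a feed (sketch)
        // @match       https://social.example.com/*
        // @run-at      document-start
        // @grant       none
        // ==/UserScript==
        (function () {
          'use strict';
          const DELAY_MS = 3000; // arbitrary friction before any fetch resolves
          const realFetch = window.fetch.bind(window);
          window.fetch = (...args) =>
            new Promise((resolve) => setTimeout(resolve, DELAY_MS))
              .then(() => realFetch(...args));
        })();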

      I continue to believe that the manifest changes aren't written with the perspective of enabling creative, unseen uses of the API in the future. They're written from the perspective of, "let's decide up-front what extensions we want, and enable specifically them."

      The feedback people have given on this is extremely broad, and is mostly ignored by this post. It's disappointing to see a response that at least somewhat suggests the Chrome team is dead set on shipping this, and is only willing to bend so far as it takes for them to enable the most popular adblockers that exist today. If it took that much feedback to get Chrome to even slightly tweak the design, then what possible feedback can people give going forward to make anything more significant happen?

      • kbenson 1895 days ago
        Tampermonkey being limited is huge for us. I actually used to maintain a Chrome extension for work to automate interaction with some partner sites, but Tampermonkey is so much easier to manage, test on, and update for. I'm dreading having to go back to extensions, especially since all of them will have a review process, so there's yet another hurdle.
      • SquareWheel 1895 days ago
        >I use Tampermonkey scripts on a couple of social networking sites deliberately to slow them down so that I'll be less likely to impulsively refresh them.

        Could you use Chrome's simulated network throttling instead?

        • danShumway 1895 days ago
          Do you mean through the dev tools? That would require me to leave the dev tools open while I browsed. I would also need to manually turn it on, which defeats the purpose of it being an automatic thing that interrupts an instinctual behavior.

          I don't see an API anywhere that makes network throttling available to extensions, but let's assume Chrome adds one.

          In that case, it still lacks granularity -- I only want to slow down some requests on some sites. One thing I've been thinking about doing if I turn this into its own extension is having it respond to your aggregate time on a site. So the more time you spend on a social site, the slower it gets, but if it's been closed for a while it starts to "recharge" and speed up again. In particular I'm thinking about that for sites like Twitter, where I don't mind checking it so much as browsing it.

          There's a lot of interesting stuff that's possible with the current API that can't be replicated by just saying, "slow down everything across the board."

    • gpm 1895 days ago
      > Users need to have greater control over the data their extensions can access.

      Ok, so then why

      > In particular, there are currently no planned changes to the observational capabilities of webRequest

      That's before you even consider that WebExtensions can run arbitrary code on the page and extract whatever information they want.

      Edit: Also

      > Increased Ruleset Size: We will raise the rule limit from the draft 30K value. However, an upper limit is still necessary to ensure performance for users.

      But, as per the article

      > All content-blockers except DuckDuckGo have sub-millisecond median decision time per request.

      So that doesn't make any sense as a justification either

      • userbinator 1895 days ago
        I find arbitrary limits like this to be totally idiotic. The number of rules does not need to fit in a 16-bit variable. The "to ensure performance for users" line is simply patronising and an attempt to divert attention away from the fact that they are trying to neuter this feature as much as they can without raising too much opposition. I should be able to filter millions of rules if I have the RAM and CPU power available.
    • tyingq 1895 days ago
      "Seems they've decided to support dynamic rule addition"

      I find it telling that it was left out in the first place. I can't think of any plausible reason to have omitted it, other than to purposely hobble adblockers.

    • SquareWheel 1895 days ago
      Has Chromium's response been posted to HN yet? I haven't seen it as a headline.
  • Dahoon 1895 days ago
    So Ghostery runs the benchmark and is the fastest? Fine. But Ghostery does not belong in that test at all. Of course it can be faster: it's missing a lot of the most important features, like custom lists. Totally an ad.
    • robertAngst 1895 days ago
      HN is like 50% ads, 50% programmers.

      Enjoy the good, ignore the bad.

    • skilled 1895 days ago
      Haha, the whole article is focused on Ghostery against Them.
    • r3bl 1895 days ago
      Scroll to the footer and you'll see that this is Ghostery's website.

      I wouldn't really call promoting their own product on their own webpage an ad.

      • lexicality 1895 days ago
        It's still an advertisement for them, it's just not shilling (or "paid content" or whatever fake blogs are called these days.)
      • kakarot 1895 days ago
        A rose by any other name would smell as sweet. Ghostery doesn't want their bias to be obvious to the non-discerning user because it's obviously an attempt to reclaim market share using dirty tactics.
        • tyingq 1895 days ago
          Fighting one dirty tactic (manifest v3) with another, though. I'm more alarmed about the former.
          • kakarot 1895 days ago
            If you're alarmed about Manifest v3, we welcome you into the Firefox fold.

            Of course, we've had our own issues with WebExtensions, but since they're not politically motivated, these issues will hopefully be resolved.

          • Dahoon 1894 days ago
            How are they "fighting one dirty tactic with another"? Clearly they are saying that the whole thing isn't a problem at all! They are fighting for manifest v3 with dirty tactics if anything.
      • Dahoon 1894 days ago
        They don't state that they are testing their own product, as far as I can tell. Strike one. They use a domain that says nothing about them being one of the tested products' authors. Strike two. The whole thing is totally biased, testing apples against oranges, and then posted to HN. Most papers would force them to have a "This is a paid ad" disclosure somewhere. Strike three and out.
  • move-on-by 1895 days ago
    Thanks for the interesting read! Just right off the bat, I’m a little cautious of benchmarks done by Ghostery that happen to show Ghostery is incredibly fast. Not that anything else stood out to me as suspicious, just a comment. Perhaps they are the fastest _because_ they are benchmarking and have fixed bottlenecks.

    Beyond all that, I’m a huge fan of ad blockers and use them as an attempt to reduce tracking and targeting and it’s abundantly clear to me that they greatly speed up many webpages. The amount of junk that so many sites load, no doubt it even saves me data on my data plan.

    It seems like common sense to not trust a company whose main income is based on advertising to be making decisions on ad blocking.

    • pythux 1895 days ago
      Thanks a lot for your comment. The project is open source and everything needed to run the study comparisons is available as well there: https://github.com/cliqz-oss/adblocker/tree/master/bench/com...

      We would be really happy to see people running the same benchmarks and try to reproduce the results!

      • alextooter 1895 days ago
        Hi, thanks for this fast ad-block software. After giving it a try, I found it does not provide an option to let me subscribe to EasyList or something like that. Then I opened a website, and lots of ads were still there; right-clicking on an ad image gave no option to remove them.

        From a user's perspective, it may not be ready for everyday use.

  • mirashii 1896 days ago
    > This work was motivated by one of the claims formulated in the Manifest V3 proposal of the Chromium project: "the extension then performs arbitrary (and potentially very slow) JavaScript", talking about content-blockers' ability to process all network requests. From the measurements, we do not think this claim holds, as all popular content-blockers are already very efficient and should not incur any noticeable slow-down for users.

    There are a few issues with the conclusion here. First, the article measures and discusses only the time required to block a single request. Modern web pages issue many, many more requests than that, like the 35 that this page issues. At median timings, that would put the DuckDuckGo blocker at almost 300 ms, well within what humans can notice.

    The second is that this API is not used solely by the popular content blocking extensions, but by a variety of other extensions. The Chrome team's performance concerns likely stem from the fact that a user won't be able to differentiate between the browser slowing down and an errant extension slowing down network requests, and there are examples of extensions that use this API to issue additional network requests or otherwise slow things down. If you cherry-pick the good citizens of this API to show that performance isn't a problem in general, you're not showing that performance can't or shouldn't be the reason, just that it isn't the reason for the fast good citizens. What this data could be used to argue is that imposing strict deadlines on the execution time of these extensions would allow the content blockers that the community cares about to continue to function as they do today while also placing a performance cap on bad extensions.

    • pythux 1895 days ago
      Thanks for the reply. Disclaimer: I worked on this study.

      > There are a few issues with the conclusion here. First, the article measures and discusses only the time required to block a single request. Modern web pages issue many, many more requests than that, like the 35 that this page issues. At median timings, that would put the DuckDuckGo blocker at almost 300 ms, well within what humans can notice.

      That is true, but on the other hand the DuckDuckGo blocker is the exception here and they could likely improve this performance if it becomes their focus (one way would be to use one of the faster open-source alternatives). If you consider uBlock Origin, Adblock Plus or Ghostery, we see that even blocking 100 requests would not take much time (probably around 1 ms with Ghostery).

      > The second is that this API is not used solely by the popular content blocking extensions, but by a variety of other extensions. The Chrome team's performance concerns likely stem from the fact that a user won't be able to differentiate between the browser slowing down and an errant extension slowing down network requests, and there are examples of extensions that use this API to issue additional network requests or otherwise slow things down. If you cherry-pick the good citizens of this API to show that performance isn't a problem in general, you're not showing that performance can't or shouldn't be the reason, just that it isn't the reason for the fast good citizens. What this data could be used to argue is that imposing strict deadlines on the execution time of these extensions would allow the content blockers that the community cares about to continue to function as they do today while also placing a performance cap on bad extensions.

      There are indeed examples of extensions doing bad things: collecting private data, etc. But we are talking about diminishing the potential privacy protection of all users to prevent some abuse. On the other hand, the manifest v3 will not prevent extensions from being slow, or doing bad things. Extensions will still be able to use content-scripts, inject arbitrary content in pages or send any private data home. WebRequest listeners will also still be accessible (only not in blocking mode) which can allow any data collection.

      So yes, I think these changes could in theory prevent some cases of abuse, but I strongly believe that they will overall weaken the privacy protection of users and that this is not an acceptable trade-off.

      • saagarjha 1895 days ago
        > On the other hand, the manifest v3 will not prevent extensions from being slow, or doing bad things. Extensions will still be able to use content-scripts, inject arbitrary content in pages or send any private data home. WebRequest listeners will also still be accessible (only not in blocking mode) which can allow any data collection.

        I can choose to not install extensions that do these things, while using ones that only have a manifest list…

        • pythux 1895 days ago
          I agree with you, and that's why I would like to see this declarative API being an addition to the current WebRequest API. This way there could be extensions using it exclusively and users could decide to pick these if they offer sufficient privacy protection for their taste. On the other hand, you would still have the option of installing more powerful extensions using the dynamic APIs (allowing things which will never be possible with the declarative API).
          • saagarjha 1895 days ago
            Yup, I’m not saying that the old API should be killed; it’s just that it’s convenient to have the new one and be able to “trust” the extension to not be able to slow down my browsing, steal information from the page, etc. but also be able to fall back to something I do trust for what gets through if necessary.
    • dasuisa 1895 days ago
      As you suggest, there are many ways to punish the bad citizens (slow or malicious extensions) without harming the good ones, which is what Manifest v3 will do. You can solve it with UX (e.g. showing that an extension is slowing down a page), or by imposing strict guidelines on extensions for webRequest API times. Google is already quite strict about many aspects of extensions (e.g. obfuscated code was forbidden recently), so I don't see why they cannot be strict about performance as well.
      • mirashii 1895 days ago
        Right, but again, if that's what you want to argue, this article is not the way to do it. Concluding that the performance claim doesn't hold because the top performers are performant is different than saying something along the lines of "While performance may be an issue in some cases, it clearly does not have to be to achieve this use case we care about, and so we should think about alternative solutions".

        Arguing this correctly is extremely important if the community wants to change Google's mind, and the miscellaneous fear-mongering and accusations of bullshit motivations that are flying around this and other threads can be easily dismissed if we aren't more careful about how we conclude analyses like these.

    • the8472 1895 days ago
      Slowing down requests can be intentional, so even putting a cap on it would limit legitimate usecases. Instead the information should simply be surfaced to the user so they can decide to remove the extension if they don't want the slowdowns.
  • seanwilson 1895 days ago
    > This work was motivated by one of the claims formulated in the Manifest V3 proposal of the Chromium project: "the extension then performs arbitrary (and potentially very slow) JavaScript", talking about content-blockers' ability to process all network requests. From the measurements, we do not think this claim holds, as all popular content-blockers are already very efficient and should not incur any noticeable slow-down for users. Moreover, the efficiency of content-blockers is continuously improving, either thanks to more innovative approaches or using technologies like WebAssembly to reach native performance.

    Not commenting on whether the proposed V3 changes are a good idea or not, but aren't the changes supposed to prevent badly written extensions from slowing everything down?

    It might be the case these adblockers are well written but you can't guarantee that for all extensions. If a user doesn't have an obvious way to know it's the fault of an extension then Chrome gets the blame.

    By the way, I have a Chrome extension that from its own tab requests many pages from arbitrary domains to examine their content. I need to modify the request headers going out and observe the response headers coming in, but only for requests made from the extension (so it's not impacting other tabs at all). I'm guessing V3 impacts this only if the request header modifications are dynamic and can't be added via the fetch API?
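
    For context, this is roughly what "adding headers via the fetch API" looks like from an extension's own page; a sketch with a made-up header name. Note that forbidden headers such as Referer, Origin or Cookie cannot be set this way, and which response headers are readable still depends on the extension's host permissions and CORS:

      // From the extension's own tab: request a page with a custom header
      // and inspect the response headers it can see.
      async function probe(url) {
        const response = await fetch(url, {
          headers: { 'X-Example-Probe': '1' }  // hypothetical header; forbidden headers can't go here
        });
        for (const [name, value] of response.headers) {
          console.log(name, value);
        }
        return response.text();
      }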

    • pythux 1895 days ago
      > It might be the case these adblockers are well written but you can't guarantee that for all extensions. If a user doesn't have an obvious way to know it's the fault of an extension then Chrome gets the blame.

      It has been suggested in another comment that Chrome could give visual indications of the performance of extensions. They already do some of this to track the memory used.

    • Proven 1895 days ago
      > If a user doesn't have an obvious way to know it's the fault of an extension then Chrome gets the blame.

      If the network or DNS or system is slow then Chrome gets the blame.

      Perhaps they shouldn’t install anything if they’re not competent enough to use it.

  • dasuisa 1896 days ago
    Perfect way of showing that Google's performance argument is bullshit: just measure. And congrats on being faster than uBlock Origin, not an easy feat.
    • clouddrover 1895 days ago
      > congrats on being faster than uBlock Origin

      I'd like to see a comparison against the WebAssembly build of uBlock Origin though. uBlock Origin uses WebAssembly in Firefox to speed up some functions. I'm not sure if Chrome allows add-ons to use WebAssembly yet.

      • dasuisa 1895 days ago
        Ah, that's an interesting point; I did not know that it could not use WebAssembly in Chrome yet (https://github.com/WebAssembly/content-security-policy/issue...). It would indeed be interesting to measure, although the amount of wasm code there seems to be minimal so far.
        • pythux 1895 days ago
          As far as I know WebAssembly in uBlock Origin is currently used for two things:

          1. Matching the $domain option using an optimized Trie data-structure

          2. Parsing domains using the public suffix list, also based on a Trie data-structure

          I would love for someone well-acquainted with the uBlock Origin codebase to update the benchmark so that we can compare.

          Edit: All the code to create the dataset (as well as the dataset used for the study itself) and to run the benchmark and analyze the results (create the plots, etc.) is available on the repository and should be reasonably easy to run locally.
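
          For readers unfamiliar with the technique, here is a toy JavaScript version (not uBlock Origin's actual code) of hostname matching with a trie keyed on domain labels, read right to left so that an entry for "example.com" also covers its subdomains; the WebAssembly code speeds up roughly this kind of lookup:

            // Minimal hostname trie sketch: labels are stored right-to-left so a
            // stored suffix matches any deeper subdomain.
            class HostnameTrie {
              constructor() { this.root = new Map(); }
              add(hostname) {
                let node = this.root;
                for (const label of hostname.split('.').reverse()) {
                  if (!node.has(label)) node.set(label, new Map());
                  node = node.get(label);
                }
                node.set('$', true); // end-of-entry marker
              }
              matches(hostname) {
                let node = this.root;
                for (const label of hostname.split('.').reverse()) {
                  if (node.get('$')) return true; // a stored suffix already matched
                  node = node.get(label);
                  if (!node) return false;
                }
                return Boolean(node.get('$'));
              }
            }

            const trie = new HostnameTrie();
            trie.add('example.com');
            console.log(trie.matches('ads.example.com')); // true
            console.log(trie.matches('example.org'));     // false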

          • gorhill 1895 days ago
            The same code used to match the `domain=` option is also used to match all filters which are essentially just a plain hostname, i.e. `||example.com^` -- which is a majority of filters found in filter lists.
        • kkm 1895 days ago
          AFAIK, in Chrome, WebAssembly works fine when used inside Workers.
    • mirashii 1895 days ago
      > Perfect way of showing that Google's performance argument is bullshit: just measure.

      I want to repeat here that this _does not_ show that the performance argument is bullshit. What we can say it shows is that existing content blockers whose functionality the community is concerned about losing are largely performant enough that performance isn't a concern for that subset of extensions.

      We can in turn conclude that there may be other ways of solving the general performance concern that the existing API poses by other clever means, such as limiting execution time and disabling extensions that violate those deadlines, or UX changes that allow the user to more readily distinguish which extensions are causing performance issues.

      Continuing to push the narrative that the performance concerns are bullshit and a pretext for disabling ad-blockers is hurting the cause of allowing the existing APIs to continue to exist.

      • danShumway 1895 days ago
        > Continuing to push the narrative that the performance concerns are bullshit

        There are two ways to interpret Google's claims about performance:

        A) Content blocking in general is impossible to do quickly without a declarative API.

        B) The current API gives bad extensions too much power to slow down pages.

        We already know that argument B is wrong, because the changes Chrome is proposing don't prevent extensions from slowing down the page. If Chrome was completely, 100% deprecating the old API, I think B would be a much, much stronger claim. But they're not.

        So that leaves argument A, which is exactly what this article attacks. It is possible to build adblockers with the current APIs that are good enough that performance doesn't matter.

        Now, if the Chrome team wants to argue that extensions have too much power and malicious actors can do bad things, that's a legitimate claim to make that I honestly kind of agree with. But that's so clearly not what the point of these changes are, because a set of changes that were focused on that would look very different from what we've gotten. It's the same reason why I don't take Google's privacy claims seriously for the manifest -- because they haven't actually deprecated any of the worst features that allow people to spy on me.

        If someone claims that they need to buy a new car to help save the environment, and then they show you an ad for a pickup truck, I think it's reasonably safe to assume the original claim was just an excuse.

    • kkm 1896 days ago
      According to the @pythux - author of the post, some neat tricks used: https://twitter.com/Pythux/status/1096516371163295750
  • ac130kz 1895 days ago
    Ghostery does not support custom external lists, which is why it is useless
    • alextooter 1895 days ago
      True, I can't use a local ad list; it doesn't work like ABP or uBlock, which is not user friendly.
  • pjc50 1896 days ago
    Background: https://bugs.chromium.org/p/chromium/issues/detail?id=896897... : a proposed change to Chrome that would limit adblock extensions.
    • tedivm 1895 days ago
      It also screws over people who use Tampermonkey.
  • kacamak 1895 days ago
    Worth keeping in mind here is that Ghostery is proprietary, while uBlock is free software. You should never trust proprietary extensions with your data.
    • kkm 1895 days ago
      Ghostery has been open source for almost a year now. You can check the code and repo: https://github.com/ghostery/ghostery-extension
    • lordlimecat 1895 days ago
      Unless you audit the source of every piece of software you use as well as that of every compiler used in making their binaries, you aren't in a position to make that sort of absolute statement. OSS has benefits but for most users they are delegating the code review to someone else, which makes it very similar to proprietary code.

      There is an argument to be made about motives with free vs non-free software, but open vs closed source is for most people much less important.

      • commoner 1894 days ago
        Transparency in software releases isn't all-or-nothing. The parent comment makes a good point about preferring more transparency (free and open source software) over less transparency (proprietary software) as a consumer.

        However, the parent comment is a bit misplaced since Ghostery appears to be released under the Mozilla Public License 2.0.

        https://github.com/ghostery/ghostery-extension/blob/master/L...

  • zaro 1895 days ago
    This is great news, but I think most people already using ad blockers know that even a slow ad blocker improves the loading speed of web pages. Even if blocking a request takes 10 ms, it's still a win when it blocks something like a tracker which records every click and mouse movement on the page.
  • ra7 1896 days ago
    Can we not have an editorialized title, please?
    • QuercusMax 1896 days ago
      Yeah... that's barely even what the article is about, based on a brief skim. They seem to disagree with what Manifest v3 is trying to accomplish, but the article doesn't even use the word "bullshit".

      (Disclaimer: I work for a non-Google Alphabet company.)

      • btown 1895 days ago
        Agree with the need not to editorialize, but that is indeed what the article is about. The entire impetus for measuring the performance impact of ad blockers is to disprove one of the two cited reasons for Manifest v3, namely that there are real-world measurable performance benefits to preventing ad blockers from intercepting web requests.
      • kkm 1896 days ago
        Thank you for the feedback, changed the title to match the title of the blog.
  • saagarjha 1895 days ago
    > This work was motivated by one of the claims formulated in the Manifest V3 proposal of the Chromium project: "the extension then performs arbitrary (and potentially very slow) JavaScript", talking about content-blockers' ability to process all network requests. From the measurements, we do not think this claim holds, as all popular content-blockers are already very efficient and should not incur any noticeable slow-down for users. Moreover, the efficiency of content-blockers is continuously improving, either thanks to more innovative approaches or using technologies like WebAssembly to reach native performance.

    I don't think it's valid to debunk this claim without testing the speed of manifest/content blocker list-based blocking.

    • loeg 1895 days ago
      I believe this article does debunk the Google performance claim. It doesn't really matter if the "manifest" system is perfectly fast (exactly 0 seconds); this article shows that the current blockers are fast enough that they are indistinguishable from zero. There is limited room for improvement here (so limited as to be effectively none).
      • mirashii 1895 days ago
        Again, it doesn't debunk it because the claim isn't "Current content blockers are not performant", it is that the API allows extensions to do things which cause performance issues, and there's a long tail of extensions that use this API and do create human noticeable delays. If you cherry pick performant examples and then try to debunk the whole landscape, it simply doesn't work.
        • loeg 1895 days ago
          > Again, it doesn't debunk it because the claim isn't "Current content blockers are not performant", it is that the API allows extensions to do things which cause performance issues, and there's a long tail of extensions that use this API and do create human noticeable delays.

          Both of these arguments are easily rebutted and have already been in this thread. As others have pointed out, the modified API still allows extensions to do things which cause performance issues, just not in that particular path. (Also, preventing ad load can improve page load performance so much that even a "slow" adblocker may make up the difference anyway.)

          > If you cherry pick performant examples

          I don't think these examples are cherry-picked; they're among the most popular adblockers in the landscape:

          * https://www.tomsguide.com/us/pictures-story/565-best-adblock...

          * https://www.digitaltrends.com/web/best-ad-blockers-for-chrom...

          As others have pointed out, you can measure and break or shame poorly performing blockers without punishing the ones that work well. So:

          > and there's a long tail of extensions that use this API and do create human noticeable delays.

          There's a long tail of extensions that are poorly behaved in general. You can punish those ones without killing the ones that don't suck.

    • pythux 1895 days ago
      Hey, disclaimer: I worked on this study.

      I would love to hear your thoughts about this. From our point of view, the argument is not that the native content blocker is slow or fast but instead that it's not an acceptable trade-off, for different reasons (see link below). We also show that some of the most popular content-blockers are more than fast enough in this regard. I would love Chrome to propose this declarative API as an addition to the current WebRequest APIs and incentivize developers to use it when it makes sense. But replacing the current WebRequest's blocking capability with this would prevent extensions from protecting users as efficiently as today.

      I wrote a bit more about other reasons why having the declarative API as the only blocking capability of the browser is not a good thing IMO: https://news.ycombinator.com/item?id=19175265

  • userbinator 1895 days ago
    I'm curious how filtering proxies like Privoxy, Proxomitron, Proximodo/Proxydomo, etc. compare --- it's an extra (local) hop of network latency, but those are pure native code. I've been using one for a long time (ever since I heard of them) and it effectively works across all the browsers on the system, even those built-in to other apps (often only for the purpose of showing ads...) Even for those who don't routinely use multiple browsers, given how increasingly user-hostile and unconfigurable they are becoming, I think it makes sense to move filtering into its own application.
    • 15DCFA8F 1895 days ago
      The problem with those external HTTP filtering proxies is that the usage of SSL/TLS is pervasive nowadays.
      • userbinator 1895 days ago
        At least some of those support MITM using a local CA.
  • lohszvu 1895 days ago
    Ghostery is owned by an advertising company. It does not allow custom lists.
  • brianpgordon 1895 days ago
    I'm surprised that Brave performs so poorly. Isn't the whole point of having a dedicated privacy-aware browser supposed to be so that the ad/tracker blocking code can be written directly in C++ and not have to run in JavaScript and talk over plugin APIs?

    Maybe these results are only applicable to the desktop version of Brave?

    • bbondy 1895 days ago
      Some things are very wonky with the experiment:

      Brave is intentionally slow on parsing and does as much work as possible there, because it doesn't parse in client code; it only uses already-parsed lists from memory.

      "The memory usage of Brave could not be evaluated using the devtools and thus is not included in this section." That doesn't make sense, I wonder if it's maybe using a very old version based on the old muon code base? If you can get the memory from Chrome you can get it from Brave.

      No information was given about versions that were tested.

      Total parsed rules is too small.

      • pythux 1895 days ago
        > Some things are very wonky with the experiment:

        Thank you for taking the time to read this study. We do not think we claimed anything that was false in this study (although the scope might not be as wide as some would expect or desire); this is not a reason to be dismissive. We have ourselves a lot of respect for the work done at Brave.

        > Brave is intentionally slow on parsing and does as much work as possible there, because it doesn't parse in client code; it only uses already-parsed lists from memory.

        That was indeed one of the things measured, but not the most important one. In fact, we explicitly say that this is a one-time operation and does not necessarily matter, especially if, as you suggest, you can perform this work backend-side and ship the serialized version to clients. What is more interesting is the time it takes to match requests.

        > "The memory usage of Brave could not be evaluated using the devtools and thus is not included in this section." That doesn't make sense, I wonder if it's maybe using a very old version based on the old muon code base? If you can get the memory from Chrome you can get it from Brave.

        If we got this thing wrong we would be very happy to update the results with the correct measurements. The version we used was the latest version from `master` on the following repository: https://github.com/brave/ad-block

        > No information was given about versions that were tested.

        This is indeed unfortunate and we will be correcting it. The measurements were performed last week with the latest version of each project, but we should definitely indicate the exact versions used.

        > Total parsed rules is too small.

        Too small for what exactly? EasyList is one of the most popular lists and it's pretty common to use it as a baseline for comparison. It is trivial to re-run the experiment with different lists given that all the code is open source.

    • Ragib_Zaman 1895 days ago
      > All blockers except uBlock Origin are available as JavaScript libraries which can be loaded in Node.js

      I may be incorrect, but I think their testing methodology involved loading a JavaScript version of the Brave adblocker and running benchmarks inside the Node.js runtime; indeed, the results might be different had they used the Brave browser itself, written in C++.

  • Tsubasachan 1895 days ago
    You know what slows down browsing the internet?

    Ads.

    • pmarin 1895 days ago
      Javascript.
  • DeepYogurt 1895 days ago
    TIL: DDG has an adblocker. Good to know and I hope they can improve it.
  • lvs 1895 days ago
    Only part of the argument with ad blockers is speed. The rest of the argument is trust. So, how do you make money?

    As far as I know, uBlock Origin has no profit motive whatsoever. Correct me if I'm wrong.

  • StreamBright 1896 days ago
    Also, how about blocking at the DNS level? Why is this article concerned with in-browser ad blocking only? I bet resource utilisation is much lower if you do it in lower layers.
    • kkm 1896 days ago
      - As pointed out in the previous comment, with DNS-based blocking users often have to choose between convenience and privacy. As an example: blocking ads on YouTube with DNS blocking is very hard, unless you block youtube* itself.

      - It does not give you fine-grained permissions: for example, if you only want to allow a particular third party on a specific site.

      While DNS-based blocking is fast, it is at the same time difficult for an average user to surf the web anonymously using it alone.

      • StreamBright 1895 days ago
        YouTube is an interesting example. I use it less and less because of ads, but can you block ads on it with anything? The problem for me is that I have several devices that do not run JS (like a TV box) and do not use a browser for watching content. For these, blocking bad content at the DNS level is probably the only solution. I like to pay for what I get (Netflix) and I reject the entire ad-based surveillance capitalism model as a whole.
        • Bjartr 1895 days ago
          I pay for YouTube (via a Google music subscription) and never see YouTube ads on any device I'm logged in on.
          • StreamBright 1895 days ago
            You pay + they collect all of your data (unless you explicitly disallowed some of it). I think it would only be fair if you pay and there is no data collection (by any means, no forced use of 8.8.8.8, etc.)
          • kkm 1895 days ago
            Nice, is this the YouTube Red subscription?
            • LeoPanthera 1895 days ago
              It's called YouTube Premium now, but yes.
        • kkm 1895 days ago
          On platforms which support extensions it's easy to block. For platforms not supporting extensions it is really tricky, and AFAIK there is no solution that works all the time for YouTube.

          With the increasing adoption of DoH it will be interesting to see how DNS-based blocking works for TV boxes etc.

          I have not checked, but it was mentioned that YouTube does not show ads while watching videos via Chromecast. It would be interesting to check how that happens.

        • stordoff 1895 days ago
          With a combination of Pi-Hole and uBlock Origin, I never see ads on YouTube.
        • darkpuma 1895 days ago
          uBlock Origin blocks youtube ads, I haven't seen any for years.

          However that doesn't solve the problem of the web player sucking. To solve that, your best option is mpv/youtube-dl.

    • thetinguy 1896 days ago
      DNS adblocking sucks because it takes too long to turn off when a website breaks because of ads.
      • StreamBright 1895 days ago
        On the contrary, I do not want to use any website that breaks because of ads.
        • kkm 1895 days ago
          Yes, that's another way to look at it. But it is not necessarily true for all the users.
          • StreamBright 1895 days ago
            I understand. I never implied that it is for all users; I was just mentioning that there are other options than JS ad filtering.

            Btw, DNS-based filtering works for non-tech-savvy users like my parents, who would certainly fall for malicious ads, which are frequent even on Google. And guess how ransomware spreads.

            https://www.zdnet.com/article/skype-served-up-malware-throug...

            • kkm 1895 days ago
              Yes, I think there are certain domains, especially spyware and malware, that need to be blocked altogether, and DNS blocking is the optimal way to do that, especially considering the wide adoption of IoT, where extensions cannot run.
      • stordoff 1895 days ago
        I don't find that. Logging into my Pi-Hole and disabling it temporarily takes about 10 seconds, and with how infrequently it happens it's a non-issue.
  • codedokode 1895 days ago
    > All benchmarks were ran on an X1 Carbon 2016 (i7 U6600 + 16 GB) in Node.js 11.9.0.

    They should test on average hardware and not on top-of-the-line hardware. Take a 2013-era Celeron or Atom with 2 GB of RAM and an HDD and test on that.

    Regarding privacy, I think the code that blocks network requests could run in an isolated environment so that it cannot send the information about requests outside.

    • barrkel 1895 days ago
      I think it should be on a desktop class machine. I'm less concerned about absolute numbers and more with variance. Laptops generally cannot dissipate all the heat from full CPU usage for more than a minute or two. Benchmark runs, if they exercise the CPU hard for any length of time, get thermally limited and the results end up with substantially more variance.

      Even things as simple as build times, I've seen vary by 20+%.

      • pythux 1895 days ago
        Benchmarking is always a hard problem - no such thing as spherical chickens in a vacuum. That said, we'd love to at least standardize the setup for other people running the benchmarks; that is why we opened all the code and data, as a starting point.

        For the study, measurements were run on one of our personal laptops (an X1 Carbon from 2016 with an i7 U6600 CPU and 16 GB of RAM, which is indeed a pretty powerful machine). We tried very hard to limit the impact of frequency throttling due to the limited thermal dissipation of the device for the long-running benchmarks. In fact, for the measurements we put the laptop outside at 0 degrees Celsius, and we could observe that the CPU temperature did not go beyond 60 degrees (which is pretty low).

        Do you have any suggestions on how we could improve this setup? We welcome all contributions.

  • Fnoord 1895 days ago
    I don't see Firefox mentioned once in this whole article, so I assume it isn't meant for the sole browser I am using. Therefore this performance study is useless for me.
  • jachee 1895 days ago
    Faster still is DNS-based, network-wide blocking with Pi-hole.
    • darkpuma 1895 days ago
      If that works for you, that's great. But it's a pretty crude tool. Kind of like wood carving with a chainsaw: some people swear by it and it's undeniably efficient, but other tools give you more delicate control. I wouldn't go without uBlock Origin's cosmetic filters, which I use to block many things that aren't ads (such as those annoying floating bars at the top and bottom of the screen a lot of websites use just to annoy users (sorry, "improve conversion") and waste vertical screen space). uMatrix (and to a lesser degree, advanced mode in uBlock Origin) also lets you differentiate between blocking first-party and third-party content. For instance, I block YouTube on every website except youtube.com.
  • ajobforme 1895 days ago
    Why is AdBlock not included?
  • Dahoon 1894 days ago
    Anyone notice who posted this? Ghostery did.

    Look at that submission history...

    https://news.ycombinator.com/submitted?id=kkm

    They should be shamed. I'll make sure to tell anyone I see mention them to steer clear.

  • g45y45 1896 days ago
    Google, the world's largest advertising company, abuses its market share to crush those who would stand in its way. I stopped using Chrome when this was proposed, and you should too!

    The article provides evidence that Google's pretext of performance doesn't hold water.

  • MrCapybara 1896 days ago
    I'm actually surprised that Brave's Adblocker was not the fastest, considering that they have native (as in, their own browser) support.
  • bradknowles 1895 days ago
    So, an ad for Ghostery?

    Was there anything else?

    • sp332 1895 days ago
      The article isn't really about Ghostery, so... yeah. The article makes an actual point. Go read it.
      • bradknowles 1895 days ago
        I did read the article.

          It read to me like a thinly veiled ad for Ghostery.

        If that wasn’t the intent, then I would encourage the authors to be more explicit and detailed about their testing methods and how they made a point of trying to level the field as much as possible, even though their own code is one of the tools being tested.

        • Dylan16807 1895 days ago
          The point of the article is that they're all fast (except DDG). It doesn't matter if the field is perfectly level, it matters that all these extensions should be able to keep working programmatically.
  • bronlund 1895 days ago
    This has to be one of the stupidest performance reviews I have ever seen.