4 comments

  • freeone3000 1246 days ago
    Can you imagine if they simply had not admitted the error? It was a year, and nobody noticed! Nobody seems to be able to tell us how much this cost them, how much spend was wasted, or whether they should have spent less, and they're not leaving Facebook over it. So what does this actually influence?
    • some1else 1246 days ago
      I have seen companies drastically reduce their spend on Facebook in the B2B SaaS space. The ones that are left are in the process of rebounding to the same volume as Q1, but something like 60% of advertisers went from tens of ads to 0 campaigns. A lot of it is likely due to the effects of the pandemic, with advertisers choosing to focus on channels with higher signal & better returns.
  • disgruntledphd2 1246 days ago
    O. M. G.

    So, the thing here is that FB's tool for helping advertisers measure the effectiveness of their ads was producing falsely positive results for an entire year, and nobody noticed.

    This is a shocking indictment of multiple teams and their code quality and QA, and is going to really, really damage their advertiser relationships for many years to come.

    The reason that this matters is that the only real way to measure effectiveness of different strategies on FB is using this tool, as FB can actually match users based on their conversion probability, allowing for valid tests.

    • suifbwish 1246 days ago
      The only reliable way to gauge the effectiveness of running an ad campaign on a platform is to have three identical products marketed under different names, each of equal repute. Then run ads for one on the platform you want to gauge, run ads for the second on a competing ad platform, and run no ads anywhere for the third, which serves as the control. Believe it or not, it's simple tests like these that keep us from slipping back into the dark ages.
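
      In rough terms, the readout from that three-arm setup would look something like this (a toy sketch with invented numbers, where the no-ads product supplies the baseline):

        # Toy sketch of the three-product design above (all numbers invented).
        # Arm A: ads on platform A; Arm B: ads on platform B; control: no ads anywhere.
        arms = {
            "platform_a": {"spend": 10_000.0, "conversions": 1_450},
            "platform_b": {"spend": 10_000.0, "conversions": 1_200},
            "no_ads":     {"spend": 0.0,      "conversions": 1_000},
        }

        baseline = arms["no_ads"]["conversions"]
        for name, arm in arms.items():
            if name == "no_ads":
                continue
            # Conversions attributed to that platform's ads, relative to the no-ads product.
            incremental = arm["conversions"] - baseline
            cpa = arm["spend"] / incremental if incremental > 0 else float("inf")
            print(f"{name}: incremental conversions = {incremental}, "
                  f"cost per incremental conversion = {cpa:.2f}")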
      • disgruntledphd2 1246 days ago
        So, I spent many years doing this professionally (and still moonlight in this area).

        > each of equal repute

        This seems incredibly hard; how would you accomplish it?

        So, the whole point of the conversion lift tool is to track all conversions associated with your ad (i.e. post view or click on the platform), while having a rigorously balanced control group.

        This allows you to estimate the incremental impact of your ads, which is what you want in order to make budgeting decisions.
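
        Concretely, the readout looks something like this (a toy sketch with invented numbers, not FB's actual tool):

          import math

          # Toy conversion-lift readout (all numbers invented).
          # "test" users were eligible to see the ads; "control" users were a randomized holdout.
          test_users, test_conversions = 1_000_000, 12_400
          control_users, control_conversions = 200_000, 2_200

          p_test = test_conversions / test_users
          p_control = control_conversions / control_users

          lift = (p_test - p_control) / p_control          # relative incremental lift vs. the holdout
          incremental = (p_test - p_control) * test_users  # conversions the ads actually added

          # Two-proportion z-test: is the difference distinguishable from noise?
          p_pool = (test_conversions + control_conversions) / (test_users + control_users)
          se = math.sqrt(p_pool * (1 - p_pool) * (1 / test_users + 1 / control_users))
          z = (p_test - p_control) / se
          p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided

          print(f"lift: {lift:.1%}, incremental conversions: {incremental:,.0f}, p-value: {p_value:.2g}")

        The whole exercise hinges on that control group being a clean randomized holdout; if it isn't balanced, the lift number is meaningless.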

        Your proposed design doesn't appear to accomplish any of this; can you clarify what you are trying to achieve with it?

        So, for my money, I would have the following concerns around this design:

        1) You're assuming that the audiences reached by Platform A and Platform B are identical

        2) You're assuming that Platform A and Platform B are identically effective at showing your ads to people

        In my experience, neither of these things is true (unless you run untargeted CPM ads, which seems unlikely in a digital world). Can you explain how this works?

  • bzb6 1246 days ago
    It’s not like they have anywhere else to go.
  • bamboleo 1246 days ago
    Feels like this confidence can be shaken more than Shakira’s behind.