10 comments

  • dmje 1045 days ago
    Really nice. I'm definitely your market (micro digital agency owner) - but feedback on pricing: I think it's the wrong model.

    I'd prefer to pay per project rather than per month. I think this makes more sense at the lower end / for the "freelance" tier.

    Say I do 5 web projects a year. Each one has a reasonable budget, and mapping is a core part. From a psychology POV, I'd much rather pay (say) a $200 one-off fee for each project (maybe based on a cost per page) than a rolling per-month cost for a thing I'm only using intermittently.

    Money-wise, my response is silly, I know, but I think the psychology at play here is important. As a small company I don't like to spend money monthly on a thing I don't use, whereas when a client has just paid me a big deposit for a new gig, I'm happy to pay.

    For larger tiers, cool. But a per-page "one-off" would make sense to me.

    • owor 1045 days ago
      Thanks for your feedback dmje.

      We've had similar discussions with customers in the past - we've found the vast majority (90%) are happy to pay monthly, and the remainder are very vocally against it.

      We are looking to accommodate both sides, and the current plan is to offer a subscription 'pause', where all of your existing data is stored for free, and you can unpause to edit/create new projects.

      It's still being worked out, but hopefully this will be a happy medium in the future.

      • methyl 1045 days ago
        If I may give you some advice.

        If you want to justify monthly payments, figure out how to provide ongoing value. In your case, you could update sitemaps in the background and notify users about changes. That way, it's much easier to see the reason behind recurring payments.
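
        A minimal sketch of the idea in Python (the crawl snapshots here are made up - a real implementation would persist crawls between scheduled runs):

          # Sketch: detect sitemap changes between two crawls.
          previous_crawl = {"/", "/about", "/pricing", "/blog/post-1"}
          latest_crawl = {"/", "/about", "/pricing", "/blog/post-2"}

          added = latest_crawl - previous_crawl    # pages that appeared
          removed = previous_crawl - latest_crawl  # pages that vanished

          if added or removed:
              print("Sitemap changed since last crawl:")
              for url in sorted(added):
                  print(f"  + {url}")
              for url in sorted(removed):
                  print(f"  - {url}")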

      • dmje 1045 days ago
        Hey - thanks - I reckon a hybrid would work well: the tiers for those who are happy with a regular payment, or "pay per page" for those like me who are more project-based. Will be interesting to see what you do!
      • Mauricebranagh 1045 days ago
        The limit of 5,000 might be a bit low for larger sites - is this pages or URLs?

        I am thinking of ecommerce and classified ads sites.

        • owor 1045 days ago
          We're currently working on scaling up to 25,000+ pages, hopefully sometime in the next 12 months!
    • strongbond 1045 days ago
      I agree. I'm turned off by the way everything is a subscription nowadays, even if you only use a service a discrete number of times. Not your fault, I know :-) everyone is doing it! But there are only so many annuities I'm happy to contribute to. I get the feeling that we're all being nickel-and-dimed to death right now.

      It looks very nice, by the way.

  • Mike_Jordan 1045 days ago
    Is there a free/open-source version of the same or a similar product?
    • robtherobber 1045 days ago
      In addition to Screaming Frog, which is very well known in the industry, you may also be interested in Sitebulb, which seems to have been inspired by Screaming Frog but takes a slightly different approach to presenting the audit information.
    • owor 1045 days ago
      Yep - Screaming Frog (desktop tool) has a limit of 500 pages in the free version, which is probably enough for most smaller-sized websites.

      In terms of web-based tools, VisualSitemaps.com also has a freemium option with a 50 page limit.

    • adreamingsoul 1045 days ago
      Screaming Frog SEO Spider?
      • 135792468 1044 days ago
        Yes, it has a mapping tool.
  • huhtenberg 1045 days ago
    > We're having trouble crawling that website. Please try again...

    It'd be helpful to know what the trouble was exactly. Some troubles are non-retryable.
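
    For example, something like this behind the scenes (a rough Python sketch to illustrate the distinction - the codes and helper are hypothetical, not your API):

      # Sketch: classify common crawl failures as retryable or not,
      # so the UI only says "try again" when it might actually help.
      RETRYABLE_HTTP = {429, 500, 502, 503, 504}  # transient server/limit errors
      PERMANENT_HTTP = {401, 403, 404, 410}       # retrying won't change these

      def describe_failure(status_code=None, dns_failed=False, blocked_by_robots=False):
          if dns_failed:
              return "Domain could not be resolved (check the URL)", False
          if blocked_by_robots:
              return "Crawling disallowed by robots.txt", False
          if status_code in RETRYABLE_HTTP:
              return f"Server returned {status_code}, please try again later", True
          if status_code in PERMANENT_HTTP:
              return f"Server returned {status_code}, retrying won't help", False
          return "Unknown error", False

      print(describe_failure(status_code=503))  # -> ('Server returned 503, ...', True)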

    • owor 1045 days ago
      Noted, we'll look into returning more descriptive messaging for common errors.
      • huhtenberg 1045 days ago
        It'd also be useful to have an option to generate a full sitemap once, for a fixed fee, sans subscription.
  • ab_testing 1045 days ago
    Pretty nice. Do this a million times and you've got yourself a search engine going.
    • owor 1045 days ago
      Watch out, Google!
  • Tepix 1045 days ago
    Going to the website, I don't see where I can "generate a free visual sitemap by crawling any website". Is it just me?

    The pricing page doesn't list any free options.

    I can create an empty sitemap but how do I crawl a website then?

    • owor 1045 days ago
      Hey Tepix - head to https://rarchy.com/sitemaps/visual-sitemap-generator and enter the website you wish to crawl in the left-hand side input.

      This will crawl the site & generate a visual sitemap without requiring you to sign up for an account.

      • Tepix 1045 days ago
        I see the issue now: with uBlock Origin active, the input field is invisible.

        I noticed another issue: if you enter "domain.com" into the input field but the website redirects only the starting page to "www.domain.com", the links in the map may not work because they lack the "www." prefix.
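
        A possible fix, sketched in Python with the requests library (an assumption, just to show the idea): follow redirects on the start page and use the final URL as the base for every link.

          # Sketch: resolve the canonical base URL before building links.
          from urllib.parse import urljoin
          import requests  # assumes the 'requests' package is installed

          def resolve_base(url: str) -> str:
              resp = requests.get(url, allow_redirects=True, timeout=10)
              return resp.url  # final URL after any redirects

          base = resolve_base("https://domain.com")  # e.g. "https://www.domain.com/"
          print(urljoin(base, "/pricing"))           # -> "https://www.domain.com/pricing"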

  • jtwaleson 1045 days ago
    Cool. Not sure if you use the sitemap to fetch by default, but if you don't, I'm really impressed by the speed! Some feedback: the prices on the pricing page look bad for me on mobile (Android, Chrome).
    • gildas 1045 days ago
      This is something you can do with SEO4Ajax [0] (shameless plug).

      [0] https://www.seo4ajax.com/

    • owor 1045 days ago
      We do offer an XML sitemap importer if you're in a hurry, but the free generator is always from a fresh crawl :)
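
      For the curious, a minimal sketch of what importing a standard sitemap.xml involves (Python standard library only - not our actual importer):

        # Sketch: extract page URLs from a standard sitemap.xml document.
        import xml.etree.ElementTree as ET

        SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>https://example.com/</loc></url>
          <url><loc>https://example.com/pricing</loc></url>
        </urlset>"""

        ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
        root = ET.fromstring(SITEMAP_XML)
        print([loc.text for loc in root.findall(".//sm:loc", ns)])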

      Thanks for letting us know about the pricing page on mobile!

  • saimiam 1045 days ago
    I have a React app website which seems to break your crawler. A future enhancement may be to run JavaScript inside the crawler to render the site.
    • owor 1045 days ago
      Thanks for the feedback.

      We have looked into developing the crawler for SPAs, but haven't managed to create a robust enough crawler just yet - we'll get there one day!
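
      The usual approach is to render each page in a headless browser before extracting links - a rough sketch using Playwright (an illustration of the technique, not our production stack):

        # Sketch: crawl a JavaScript-rendered (SPA) page by letting a
        # headless browser execute the app before collecting links.
        # Assumes: pip install playwright && playwright install chromium
        from playwright.sync_api import sync_playwright

        def rendered_links(url: str) -> list[str]:
            with sync_playwright() as p:
                browser = p.chromium.launch()
                page = browser.new_page()
                page.goto(url, wait_until="networkidle")  # wait for the SPA to settle
                hrefs = page.eval_on_selector_all(
                    "a[href]", "els => els.map(el => el.href)"
                )
                browser.close()
            return hrefs

        print(rendered_links("https://example.com"))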

  • js4ever 1045 days ago
    You should rate-limit this, or it's DDoS as a service...
    • owor 1045 days ago
      We use an auto-throttling algorithm to ensure we are a 'polite' crawler!

      We also send a unique user-agent with every request, so any host can easily block us if required - luckily, that hasn't happened so far.
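
      Roughly, the combination looks like this (a heavily simplified Python sketch - the user-agent string and back-off rule are made up for illustration):

        # Sketch: a "polite" fetch loop - identify yourself with a unique
        # User-Agent and slow down when the server responds slowly.
        import time
        import requests  # assumes the 'requests' package is installed

        USER_AGENT = "ExampleSitemapBot/1.0 (+https://example.com/bot)"  # hypothetical
        MIN_DELAY = 0.5  # seconds between requests - never go faster than this

        def polite_fetch(urls):
            delay = MIN_DELAY
            for url in urls:
                start = time.monotonic()
                resp = requests.get(url, headers={"User-Agent": USER_AGENT}, timeout=10)
                elapsed = time.monotonic() - start
                delay = max(MIN_DELAY, elapsed * 2)  # slow server -> wider gap
                yield url, resp.status_code
                time.sleep(delay)

        for url, status in polite_fetch(["https://example.com/"]):
            print(url, status)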

  • ttty2 1045 days ago
    What's the use case?