I recently began using DeepCrawl after putting it off for a while. I quickly saw the advantage of being able to crawl much larger sites, since desktop applications start consuming a lot of memory once a crawl exceeds a certain number of URLs, and I soon found opportunities for on-site improvements I had previously missed. I became a believer. Beyond handling a larger number of URLs, DeepCrawl also integrates with Google Analytics and Google Search Console, which lets it give you far deeper insights than a typical site crawler.
Google Analytics makes our lives easier: easy installation, intuitive (usually) reporting and metrics, easy integration with AdWords and Webmaster Tools, and best of all, it's free. Of course, installation is also easy to botch, and new installations don't always correctly overwrite the old ones. You may have failed to add the tracking code to certain pages. Perhaps you have multiple implementations on certain pages, or some pages still carry your old Google Analytics tracking code. Maybe you have 2, 3, or even 50 websites sharing a code base and you ended up getting your wires crossed. Any of these can really skew your metrics in numerous ways.
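As a rough illustration of what a botched install looks like in the markup, here is a minimal sketch of a single-page check. The regexes below cover three well-known snippet generations (the classic `_gat._getTracker` tracker, the async `_gaq` queue, and the Universal `ga('create', …)` call); real-world pages vary, so treat the patterns and the `audit_ga` helper as illustrative, not exhaustive.

```python
import re

# Patterns for common Google Analytics snippet variants. Illustrative only;
# real snippets vary (gtag.js, Tag Manager containers, etc.).
GA_PATTERNS = {
    "classic (_gat)": re.compile(r"_gat\._getTracker\("),
    "async (_gaq)": re.compile(r"_gaq\.push\(\s*\[\s*['\"]_setAccount['\"]"),
    "universal (analytics.js)": re.compile(r"\bga\(\s*['\"]create['\"]"),
}

def audit_ga(html):
    """Count tracker initializations per snippet variant in one page's HTML."""
    counts = {name: len(pat.findall(html)) for name, pat in GA_PATTERNS.items()}
    total = sum(counts.values())
    if total == 0:
        status = "missing"      # no tracking code on the page
    elif total > 1:
        status = "duplicate"    # more than one tracker fires per pageview
    else:
        status = "ok"
    return status, counts

page = "<script>ga('create', 'UA-12345-1', 'auto'); ga('send', 'pageview');</script>"
print(audit_ga(page))
```

Running this against a page with both an old `_gaq` snippet and a newer `ga('create', …)` call would return `"duplicate"`, which is exactly the kind of double-counting that skews your metrics.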
Enter A1 Website Analyzer
I have been using this nifty tool for a lot of on-site optimization lately. You can use it to crawl your site and find broken links and redirects, trace link juice flow, check last-modified dates, review meta and H tags, and more. It's also useful for checking for instances of specific code implementations. Out of the box, A1 WSA can check for the _gat and _gaq object methods used by the Google Analytics tracking code, as well as Google AdSense code. This is useful for finding pages that lack this code (or pages where the code has been implemented more than once).
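The same kind of site-wide check can be sketched in a few lines once you have crawl results in hand. Everything below is hypothetical for illustration: the URLs and markup are made up, and in practice the URL-to-HTML mapping would come from your crawler rather than a hard-coded dict.

```python
import re

# Hypothetical crawl results: URL -> page HTML. In practice these would
# come from a crawler; the URLs and markup here are invented for illustration.
CRAWLED = {
    "https://example.com/": "<script>ga('create', 'UA-12345-1', 'auto');</script>",
    "https://example.com/about": "<p>No tracking code here.</p>",
    "https://example.com/contact": (
        "<script>ga('create', 'UA-12345-1', 'auto');</script>"
        "<script>_gaq.push(['_setAccount', 'UA-12345-1']);</script>"
    ),
}

# One alternation per snippet generation: classic _gat tracker,
# async _gaq queue, and Universal analytics.js ga('create', ...).
SNIPPET_RE = re.compile(
    r"_gat\._getTracker\(|_gaq\.push\(\s*\[\s*['\"]_setAccount|ga\(\s*['\"]create"
)

def flag_pages(pages):
    """Return URLs with no tracking code and URLs with more than one tracker."""
    missing = [url for url, html in pages.items() if not SNIPPET_RE.search(html)]
    doubled = [url for url, html in pages.items() if len(SNIPPET_RE.findall(html)) > 1]
    return missing, doubled

missing, doubled = flag_pages(CRAWLED)
print("Missing tracking code:", missing)   # the /about page
print("Multiple trackers:", doubled)       # /contact mixes old and new snippets
```

This is the core of what a crawler-based code check does: flag pages with zero matches (lost data) and pages with more than one (inflated data).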