DeepCrawl has upped its price a bit since we first looked at it. The tool now begins at $89 per month billed month-to-month, or $81.50 per month when billed annually. That is about $10 more than it cost previously, but the increase should not be a problem for most search engine optimization (SEO) tool customers. What might throw you, however, is its design focus. This tool is meant for one thing and one thing only: inside-out, top-to-bottom website crawling. SEO tools can generally be grouped into three main categories. First, there is ad hoc keyword research, which helps you find the most opportune search pages for your content to target. Next, there is ongoing position monitoring, which is how you keep track of your search positions to identify where you are gaining and losing ground. Finally, there is crawling, in which web bots parse either a single website or systematically crawl and index the entire internet. DeepCrawl does not do internet-wide crawls, but it will give your website about as comprehensive an SEO "full body scan" as you can get.
This kind of deep site crawling is useful to businesses in a number of ways. DeepCrawl serves both novice users looking to highlight site issues and advanced users customizing their crawls, with no restrictions on data access. Crawling also makes monitoring site health easier by providing both real-time and historical context, giving businesses the ability to compare crawls year over year and drill down into various trends and report types. You generally need a combination of tools across the three categories of SEO for the most effective strategy, and site crawlers are how you identify the strengths and weaknesses of your own website and landing pages.
Editors' Choice SEO tools Moz Pro ($79.00 per month, billed annually) and SpyFu ($33.00 per month, billed annually) have far greater depth in keyword and position monitoring, but nowhere near the crawling and site audit capability. So, for example, if you used one of the Editors' Choice tools or a keyword-specific product such as KWFinder.com to identify a target search engine results page (SERP), a spot ideal for your content to rank, you would then run a DeepCrawl scan of your site's architecture to generate page breakdowns and identify the specific SEO issues your site needs to resolve for that page to rank. The same goes for figuring out why your content is losing ground in search rankings as tracked in a monitoring tool such as AWR Cloud ($49.00 per month), or for conducting root cause analysis on why your site was hit with a Google search penalty and how you can recover. Eventually, an SEO strategy always comes back to auditing your own site. That is DeepCrawl's one and only job.
As stated, DeepCrawl starts at $81.50 per month, billed annually ($89 month-to-month), for its Starter plan, which comes with 100,000 active URLs, five active projects, and full application programming interface (API) access. Most SEO tools reserve API access for enterprise tiers, so its inclusion here is a definite plus for DeepCrawl users who want deeper customization without the enterprise price tag.
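Because DeepCrawl exposes an API, routine tasks such as spinning up a project can be scripted. The sketch below is illustrative only: the base URL, endpoint path, and payload field names are our assumptions, not DeepCrawl's documented API, so consult the official API docs for the real resource names and authentication flow.

```python
import requests

API_BASE = "https://api.example-crawler.com"  # hypothetical base URL
API_KEY = "your-api-key"                      # placeholder credential

# Hypothetical endpoint and field names, for illustration only.
resp = requests.post(
    f"{API_BASE}/projects",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"name": "PCMag audit", "primary_domain": "https://www.pcmag.com"},
    timeout=30,
)
resp.raise_for_status()
print("Created project:", resp.json()["id"])
```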
Another change since we last reviewed the tool is that the company has done away with its other pricing plans. For those wanting to sign up online for the service, there is the Starter plan and that is it. More advanced features and increased service options are now all available in only one other plan type: the Corporate plan. There is not much detail here aside from "Bespoke" pricing, which we assume means a call to a company representative so they can tailor a service and pricing plan to your needs.
The site crawling process in DeepCrawl begins with a quick four-step setup. In any one crawl, you can include up to five sources: sitemaps, websites, analytics, backlinks, and URL lists. The idea is to make it easy to identify gaps in your site architecture, with specific metrics you will not find in other crawling tools, such as Orphaned Pages Driving Traffic. DeepCrawl is designed to give users a complete overview of their website's technical health, with actionable data and insights to increase SEO visibility and turn organic traffic into revenue.
When you sign up for a free DeepCrawl trial, the interface first takes you to the project dashboard. To set up our project, we entered PCMag.com as the primary domain. Step two is to pick data sources for the crawl, which can include the website itself, PCMag's sitemaps, and analytics pulled in with an active Google Analytics account. You can also add specific backlink and URL target parameters within the domain (for instance, if you have identified target pages in a keyword research tool) by uploading CSV files of specific backlinks or URL lists. There is also a pop-up box at the bottom of the page with a video tutorial on how to set up DeepCrawl projects and crawls.
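If your target pages live in a keyword research tool's export, a few lines of Python will produce a one-URL-per-row CSV for the upload step. A single-column CSV is a common convention for this kind of upload; DeepCrawl's exact expected format is an assumption here, so verify against its documentation.

```python
import csv

# Target pages identified in a keyword research tool (illustrative examples).
target_urls = [
    "https://www.pcmag.com/reviews/deepcrawl",
    "https://www.pcmag.com/picks/best-seo-tools",
]

# Write one URL per row in a single-column CSV.
with open("url_list.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for url in target_urls:
        writer.writerow([url])
```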
Step three lets you set up the parameters for the crawl itself. We were able to toggle the crawl speed anywhere from one URL to 50 URLs per second. From there, we had the option to set how many "crawl levels" deep the scan should go from the PCMag homepage, and to set the maximum number of URLs at which the crawl should stop. Our free trial was for the Starter plan, so our URLs were capped at 100,000. The final step is to choose whether this will be a one-time or a recurring crawl; recurring crawls can be hourly, daily, weekly, bi-weekly, monthly, or quarterly, with the option to set start and end times.
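To make those knobs concrete, here is one way to model the crawl settings just described. The field names are ours, not DeepCrawl's, but the value ranges mirror what the UI offers.

```python
from dataclasses import dataclass

@dataclass
class CrawlConfig:
    # One to 50 URLs per second, per the slider in DeepCrawl's UI.
    crawl_rate: int = 10
    # How many link "levels" deep to follow from the homepage.
    max_depth: int = 6
    # Hard stop; the Starter plan caps active URLs at 100,000.
    max_urls: int = 100_000
    # One-off or recurring: "hourly", "daily", "weekly", "bi-weekly",
    # "monthly", or "quarterly".
    schedule: str = "weekly"

config = CrawlConfig(crawl_rate=5, max_depth=4, max_urls=50_000)
assert 1 <= config.crawl_rate <= 50  # stay within the UI's allowed range
```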
Recurring crawls are an important feature to set up if you need regular site audits, but be careful with your URL limits depending on which plan you choose. There are also more advanced settings for deeper crawl restrictions, excluded URLs, URL rewriting, and API callbacks, but a non-technical SEO user will not necessarily need to get into those. From there, we clicked Start Crawl and received an email a few minutes later when the crawl was completed.
In the completed crawl dashboard, you are immediately faced with a wall of information. If you know what you are looking for, then there are dual search bars at the top to find a specific URL or to find the report on a particular area of the site architecture; this could be anything from body content and social tags to failed URLs and website redirects. In all, DeepCrawl ran 175 reports on our URLs (which in the free trial turned out to be capped at a little over 10,000), discovered 30 site issues, and crawled six "levels," or links, deep from the homepage.
The main dashboard page provides a list breakdown of all the current site issues, from which you can drill down to the specific URLs where DeepCrawl highlights page errors such as duplicate content, broken pages, excess links, or pages with titles, descriptions, and metadata in need of SEO improvement. Next to that list, we found an interactive pie chart breaking down the most prevalent issues among the 10,000-plus crawled pages.
DeepCrawl has also updated its user interface (UI) since our initial review, adding breadcrumbs on each page to make it easier to navigate to other parts of the platform, and adding a dashboard section to assess all running crawls in one place. For our domain, we found that, while 49 percent of our pages had no major problems (most of them "primary" pages), 30 percent of pages below the surface were dealing with 5xx server errors, five percent of pages had failed URLs, and 16 percent of pages were non-indexable. According to DeepCrawl, the 5xx errors likely occurred because the PCMag site blocked aspects of the crawl or because the crawl was too fast for the server to handle. DeepCrawl also clarified that pages flagged as primary are not necessarily problem-free: you can still review them to include missing pages, exclude low-value pages, and annotate them with on-page SEO recommendations.
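That 5xx diagnosis illustrates a general crawling tradeoff: hit a server faster than it can respond and it starts failing or throttling requests. The following is not DeepCrawl's internal logic, just a generic sketch of how a polite crawler backs off when 5xx errors appear.

```python
import time
from typing import Optional

import requests

def polite_get(url: str, max_retries: int = 3) -> Optional[requests.Response]:
    """Fetch a URL, backing off when the server answers with a 5xx error."""
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.get(url, timeout=10)
        if resp.status_code < 500:
            return resp
        # A 5xx often means the server is overloaded or actively throttling
        # the crawler; waiting and slowing down usually clears it.
        time.sleep(delay)
        delay *= 2
    return None  # give up and record the URL as failed
```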
Some of the other crawling tools we tested, including Ahrefs ($82.00 per month, billed annually) and Majestic ($49.99 per month, billed quarterly), also give you this kind of breakdown, covering basic site issues as well as backlink indexing, meaning the incoming hyperlinks from other sites to yours. What the others do not quite do is delve as deep as DeepCrawl does, particularly with breakdowns such as which pages are served over the encrypted HTTPS protocol as opposed to plain HTTP, something the Google algorithm takes into account when ranking pages.
DeepCrawl also gives you intelligent Page Groupings (which you can find in Advanced Settings when setting up a crawl), meaning your sampling is based on a percentage of pages rather than a fixed number. Grouping pages this way is important because it gives you a consistent sampling across crawls. Think of it like a scientific experiment: if you are an e-commerce website and you crawl 20 percent of your product pages, in the following crawl DeepCrawl will scan the same 20 percent, flagging pages that have been added, removed, or changed within each report. This also reduces crawl time and cost, as you are targeting your crawl at the specific subset of pages in which you are interested.
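DeepCrawl does not publish how it keeps the sample consistent, but hashing URLs into buckets is a standard way to draw the same percentage sample on every crawl; this sketch shows the idea.

```python
import hashlib

def in_sample(url: str, percent: int) -> bool:
    """Deterministically decide whether a URL falls in an N-percent sample.

    Hashing the URL (rather than sampling randomly) means the same URLs
    land in the sample on every crawl, so crawl-over-crawl comparisons
    are apples to apples.
    """
    digest = hashlib.sha256(url.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

product_pages = [f"https://shop.example.com/product/{i}" for i in range(10)]
sampled = [u for u in product_pages if in_sample(u, 20)]
print(sampled)  # the same subset on every run
```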
Similarly, DeepCrawl's Custom Extractions can be used to include or exclude parameters of your choosing. This is a more advanced feature designed for specialized crawls that home in on your focus areas. The company has also added Preset Custom Extractions for non-technical users, as writing custom extractions requires knowledge of regular expressions (regex).
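To give a flavor of what writing a custom extraction involves, here is the kind of regex you would supply to pull a value out of page HTML. The markup and patterns are invented for illustration; DeepCrawl's extraction syntax may differ in detail.

```python
import re

# Invented page snippet: a product listing with a SKU and a price.
html = '<div class="price" data-sku="PCM-1042">$89.00</div>'

# Capture the SKU and the displayed price from the page source.
sku_pattern = re.compile(r'data-sku="([^"]+)"')
price_pattern = re.compile(r'class="price"[^>]*>\$([\d.]+)<')

sku = sku_pattern.search(html)
price = price_pattern.search(html)
print(sku.group(1), price.group(1))  # PCM-1042 89.00
```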
Aside from the Google Search Console (GSC) integration (more on that below), the most prominent enhancements to DeepCrawl are in its reporting capabilities; the company has added 75 new reports in the past year. The basics are all here: you can click the Share button to get a shareable link to a report, email the report to any number of recipients, download it as a CSV file or an XML sitemap, customize it with your own branding, or add a task for that report.
The task-based feature is particularly useful. DeepCrawl allowed us to add a task for specific reports (in this case, the list of 5xx server errors) to take action on that SEO vulnerability. Beyond scheduling and customizing crawls, the Task Manager lets you track progress and manage issue and deadline workflows. The mark of a good tool is one that not only discovers an issue or an opportunity, but also helps you act on it with targeted recommendations. We were given the option to set an open or fixed task; designate a low, medium, high, or critical priority; set a deadline; and assign the task to specific individuals on our team. DeepCrawl gave us a tangible path to resolving SEO issues with our site. Only the targeted recommendations of SpyFu and KWFinder.com provided the same ability to act on SEO reporting, and none of the other crawlers included this kind of task-based action feature.
DeepCrawl has also improved the left navigation search bar to find reports using keywords (even if the keywords do not appear in a report's actual title) and to filter reports by any relevant URL or link metric. There are also a number of new data visualizations on the report and category screens, including breakdown graphs, related graphs, and new trend graphs, plus UX improvements such as graph scrolling and toggling, as well as interactive explanations of specific metrics appearing in a given graph. The reports dashboard UI is flexible as well, meaning you can drag and drop the reports you want to display.
Clicking on any URL in any report gives you detailed metrics per page. In all its URL-by-URL reports, DeepCrawl uses a custom metric called DeepRank, which measures the "internal weight" of a link, calculated similarly to Google's PageRank algorithm. It is DeepCrawl's URL authority metric, on a scale of 0 to 10, showing your most important URLs or those in need of the most improvement. Thus, while the PCMag homepage and top pages were all ranked in the 8-10 range, some issue-laden pages were left with a DeepRank of close to zero.
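DeepCrawl does not disclose the DeepRank formula beyond the PageRank comparison, but a toy power-iteration PageRank over an internal-link graph, rescaled to 0-10, conveys the idea of "internal weight."

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Toy PageRank over an internal-link graph, rescaled so the top page = 10."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page passes its rank evenly to the pages it links to.
            inbound = sum(rank[q] / len(links[q])
                          for q in pages if p in links[q])
            new[p] = (1 - damping) / len(pages) + damping * inbound
        rank = new
    top = max(rank.values())
    return {p: 10 * r / top for p, r in rank.items()}

# A tiny three-page site: the homepage accumulates the most internal weight.
graph = {
    "/": ["/reviews", "/news"],
    "/reviews": ["/", "/news"],
    "/news": ["/"],
}
print(pagerank(graph))
```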
In addition to synchronizing with Google Analytics, DeepCrawl also includes a section of data breaking down desktop and mobile pages, covering not only responsive web design but all mobile configurations, including separate mobile and desktop pages, dynamic pages, and AMP pages. These types of key mobile breakdowns can also be found in SpyFu and SEMrush ($99.95 per month), but not to the depth of DeepCrawl's metrics, and not in any of the other crawler and backlink tracking tools we tested, including LinkResearchTools ($329.00 per month, billed annually).
The biggest capability update to DeepCrawl since our initial review is its integration with Google Search Console (GSC), along with even more advanced page-by-page user experience (UX) and site performance metrics. The integration connects DeepCrawl's existing site performance insights with organic search information from GSC's Search Analytics report. By adding your Search Console property as a URL source in your crawl settings, DeepCrawl can provide impression, click, click-through rate (CTR), and average position metrics for every indexed page appearing in search results. Google's Search Analytics report comes with a 1,000-URL limit in the GSC interface, but accessing the report through DeepCrawl gets you around it.
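If you would rather work around the 1,000-row limit yourself, the Search Analytics API supports paging via rowLimit and startRow. This sketch uses google-api-python-client and assumes you already hold authorized OAuth credentials (creds) with Search Console scope; the site URL and dates are placeholders.

```python
from googleapiclient.discovery import build

def all_search_rows(creds, site_url: str, start_date: str, end_date: str):
    """Page through the Search Analytics API past the 1,000-row UI limit."""
    service = build("searchconsole", "v1", credentials=creds)
    rows, start_row = [], 0
    while True:
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["page", "device"],
            "rowLimit": 25000,     # API maximum per request
            "startRow": start_row, # offset for pagination
        }
        resp = service.searchanalytics().query(
            siteUrl=site_url, body=body).execute()
        batch = resp.get("rows", [])
        rows.extend(batch)
        if len(batch) < 25000:
            # Each row carries clicks, impressions, ctr, and position.
            return rows
        start_row += len(batch)
```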
Other SEO tools such as SEMrush and Majestic also integrate with Google Search Console, but DeepCrawl's integration is the only one among the tools we have tested that uses it to provide device-specific SERP metrics. Through the integration, DeepCrawl has released 10 new reports, two new graphs, and deeper desktop/mobile/tablet comparisons. The device breakdowns and comparisons now pull in GSC data on a country-by-country basis, search impressions and effectiveness for indexable and non-indexable pages, crawled pages getting traffic from image search, and metrics on Google AMP pages.
One unique feature here is a measure of mobile/AMP pages receiving desktop traffic and vice versa, meaning DeepCrawl will show whether your pages are ranking on the wrong device. To do this, DeepCrawl crawls all separate mobile or AMP URLs to highlight differences and mismatches between desktop and mobile content, and its reports check that high-value pages all exist on mobile.
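A simplified illustration of the kind of check involved: find the mobile alternate a desktop page declares via its link rel="alternate" tag and confirm the URL actually resolves. DeepCrawl's own checks are far more thorough; this is just the gist.

```python
from html.parser import HTMLParser

import requests

class AlternateFinder(HTMLParser):
    """Collect the mobile URL declared via <link rel="alternate" media=...>."""
    def __init__(self):
        super().__init__()
        self.mobile_url = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "media" in a:
            self.mobile_url = a.get("href")

def mobile_alternate_ok(desktop_url: str) -> bool:
    """Return True if the page declares a mobile alternate that resolves."""
    resp = requests.get(desktop_url, timeout=10)
    finder = AlternateFinder()
    finder.feed(resp.text)
    if finder.mobile_url is None:
        return False  # no separate mobile page declared
    return requests.head(finder.mobile_url, timeout=10,
                         allow_redirects=True).ok
```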
DeepCrawl also provides data on social tagging, such as the pages with valid Twitter Cards, as well as a tab simply showing page performance in terms of load time, or the time it took to "fetch" a page. Finally, the tool provides website migration reports to analyze live and staging websites during a migration and to report specifically on HTTP and HTTPS pages. These are metrics you will typically find in website monitoring tools, but in an SEO context they can be valuable simply for identifying pages where the UX is poor and for ensuring smooth website migrations.
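Fetch time itself is easy to approximate; a crude version of the per-page timing a crawler reports is just a timed HTTP request.

```python
import time

import requests

def fetch_time(url: str) -> float:
    """Return the seconds taken to fetch a page, a rough stand-in for the
    kind of per-page performance number a crawler reports."""
    start = time.perf_counter()
    requests.get(url, timeout=30)
    return time.perf_counter() - start

print(f"{fetch_time('https://www.pcmag.com'):.2f}s")
```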
DeepCrawl is the best crawling tool we tested, by a wide margin. It provides the greatest depth of domain scanning and SEO reporting, with data and metrics so granular they can honestly be overwhelming for an SEO novice. Much of the site architecture mapping and domain analysis it provides is arguably better suited to developers and IT. DeepCrawl is also testing an experimental Site Explorer mode that lets you visually explore site architecture.
As with KWFinder.com's dominance in ad hoc keyword research, DeepCrawl's laser focus on crawling is both its blessing and its curse. The narrow scope of its functionality precludes DeepCrawl from earning an Editors' Choice nod alongside more full-fledged SEO platforms Moz Pro and SpyFu. However, in terms of standalone crawling capability to scan your website or your competitors' sites from top to bottom, DeepCrawl is a high-powered SEO microscope.
PROS
The most granular and comprehensive website crawling tool we tested.
On-page SEO recommendations.
Responsive modern interface.
Google Analytics and Google Search Console integration.
Backlink tracking.
AMP metrics.
Desktop/mobile/tablet breakdowns.
CONS
Site crawling is all it does.
No keyword research, position monitoring, or web-wide indexing features.
Depth of crawled data can be overwhelming if you don't know what you're looking for.
BOTTOM LINE
DeepCrawl is a top-to-bottom site crawler, and it does this one job well. However, the lack of any other kind of SEO capability will send marketers looking for an all-around tool set elsewhere.