Google follows links and the sitemaps submitted in Search Console to crawl websites.
Pages are revisited to check whether they have changed.
The crawler needs to prioritise what to crawl and when, across different pages and different domains/websites.
Successfully crawled pages are passed to the index.
Google computes a crawl rate for your website according to the traffic your site can handle.
Crawl Stats Report
Provides stats about Google's crawling of your site.
Look out for errors and for drops & spikes in the graphs.
Log in to Search Console and click on Settings.
THIS REPORT IS ONLY AVAILABLE FOR DOMAIN-LEVEL PROPERTIES – the ones verified without https:// etc. in the address.
Look for spikes and drops in the data
E.g. a sudden drop can be a sign that robots.txt has been updated incorrectly.
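If you suspect a bad robots.txt edit is behind a crawl drop, a quick sanity check is to parse the file and see what Googlebot is actually allowed to fetch. A minimal sketch using Python's built-in `urllib.robotparser` – the `robots_txt` content here is a made-up example, so swap in your own site's live file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- in practice, fetch the live file
# from https://yoursite.com/robots.txt instead.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A blanket "Disallow: /" here would explain a sudden drop in crawling.
for path in ("/", "/admin/"):
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```

If your homepage (`/`) comes back "blocked", you've likely found the culprit.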
That’s about it.
Oh, check your server/host is 'working okay' too –
Find the above report by clicking "Settings" in the left-hand sidebar/menu > Crawl Stats > choose a "host" in the middle panel (usually just your homepage address) > then "Host status".
- It’s good to check crawl stats and average response time before and after a migration or redesign.
Average Response Time:
Under 200ms: Excellent. Site is well optimized for crawling by Google etc.
200ms-500ms: Acceptable. That’ll do pig, that’ll do.
500ms-1s: Slightly too slow, especially if your site has 1,000+ pages.
Over 1 second: Too slow. Your server needs serious attention.
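For a rough before-and-after comparison around a migration or redesign, you can time a few fetches of your own homepage and grade the average against the bands above. A sketch, not a substitute for the Crawl Stats report (which averages Googlebot's own fetches over time) – `avg_response_ms` and `grade` are names made up for this example:

```python
import time
from urllib.request import urlopen

def grade(ms):
    # Bands taken from the table above.
    if ms < 200:
        return "Excellent"
    if ms <= 500:
        return "Acceptable"
    if ms <= 1000:
        return "Slightly too slow"
    return "Too slow"

def avg_response_ms(url, samples=5):
    # Crude client-side timing: fetch the page a few times and average.
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        urlopen(url).read()
        times.append((time.perf_counter() - start) * 1000)
    return sum(times) / len(times)

# Example usage (uncomment and point at your own site):
# ms = avg_response_ms("https://example.com/")
# print(f"{ms:.0f}ms -> {grade(ms)}")
```

Run it before the migration, save the number, run it again after, and compare.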