Crawl Rate
Crawl rate is the number of requests a search engine crawler makes to a website in a day, and crawl rate limits were introduced to reduce server overload. Google uses sophisticated algorithms to determine and set an optimal crawl budget for each individual site. This is covered in our SEO Office Hours Notes below, along with further best practice advice.
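If you want to see the crawl rate Google is actually applying to your site, raw server logs are the most direct source. The sketch below is a minimal example in Python, assuming a combined-format access log named access.log and a simple user-agent match on "Googlebot" (no reverse-DNS verification), which counts Googlebot requests per day:

```python
import re
from collections import Counter
from datetime import datetime

# Matches the date portion of a combined-format log line, e.g. [12/Mar/2024:10:15:00 +0000]
LOG_DATE = re.compile(r"\[(\d{2}/\w{3}/\d{4}):")

def googlebot_requests_per_day(log_path: str) -> Counter:
    """Count requests per day from user agents identifying as Googlebot."""
    per_day = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:  # naive UA match; no reverse-DNS check
                continue
            match = LOG_DATE.search(line)
            if match:
                per_day[match.group(1)] += 1
    return per_day

if __name__ == "__main__":
    counts = googlebot_requests_per_day("access.log")  # assumed file name
    for day, hits in sorted(counts.items(),
                            key=lambda item: datetime.strptime(item[0], "%d/%b/%Y")):
        print(f"{day}: {hits} Googlebot requests")
```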
Google Checks Robots.txt Once per Day
Most sites have their robots.txt file checked about once a day. You can trigger a faster refresh manually by submitting it via the robots.txt tool in Search Console.
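Before asking Google to re-fetch the file, it is worth confirming that the live robots.txt actually says what you think it does. A minimal check using Python's standard library, with a placeholder domain and example paths, might look like this:

```python
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder URL for illustration
TEST_PATHS = ["/", "/private/", "/blog/new-post"]   # paths you care about

parser = RobotFileParser(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

for path in TEST_PATHS:
    allowed = parser.can_fetch("Googlebot", path)
    print(f"Googlebot {'may' if allowed else 'may NOT'} fetch {path}")
```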
Site Traffic Doesn’t Impact Crawl Rate
Traffic volume doesn’t affect crawl frequency, as Google doesn’t have visibility of a site’s traffic. If Google can recognise important pages that frequently link to new pages with unique content, those new pages will be crawled more quickly. But crawl rate, and the rate of change of a page, have no direct relationship to ranking.
Fetch Time Impacts Crawl Rate
Google recommends a page fetch time of less than 1 second, as sites that respond more quickly will get a higher crawl rate. This won’t directly help you rank better, but it might help you get ranked more quickly.
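For a quick sense of how your pages compare against that guideline, a rough timing check from the standard library is enough for a spot check; the URL and sample count below are placeholders:

```python
import time
from urllib.request import urlopen

URL = "https://www.example.com/"  # placeholder URL for illustration
SAMPLES = 5

timings = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with urlopen(URL, timeout=10) as response:
        response.read()  # include the body download, not just the headers
    timings.append(time.perf_counter() - start)

average = sum(timings) / len(timings)
print(f"Average fetch time over {SAMPLES} requests: {average:.2f}s")
if average >= 1.0:
    print("Slower than the ~1 second guideline mentioned above.")
```

Bear in mind this measures fetch time from your own network rather than from Google’s crawlers, so treat the result as indicative only.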
500 Error Pages May Impact Crawl Rate and Will Eventually Be Treated as 404
500 errors can impact ranking and might result in a lower crawl rate. If they persist, the affected pages will eventually be treated like 404s and dropped. Also, Google won’t see the content of a page returning a 500, so you can’t use a meta noindex to get Google to drop those pages if that’s what you want.
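One practical precaution is to regularly check important URLs for 5xx responses, so persistent server errors are fixed before Google starts treating those pages like 404s. A small sketch, with a placeholder URL list, could look like this:

```python
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

URLS = [  # placeholder list of URLs you want to monitor
    "https://www.example.com/",
    "https://www.example.com/some-page",
]

for url in URLS:
    try:
        with urlopen(url, timeout=10) as response:
            status = response.status
    except HTTPError as error:   # raised for 4xx/5xx responses
        status = error.code
    except URLError as error:    # DNS or connection failures
        print(f"{url}: request failed ({error.reason})")
        continue
    flag = "  <-- server error" if 500 <= status < 600 else ""
    print(f"{url}: {status}{flag}")
```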
Fresh Links Increase Crawling of Old Pages
If Google sees a new link to an old page, it’s more likely to crawl that page again sooner. So adding new links to re-activated pages should get them discovered more quickly.