Sitemaps
A sitemap is a list of the live URLs on a site. It is used to inform search engine crawlers of a site's most important pages, and therefore which ones should be crawled and indexed.
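For context, a minimal XML sitemap follows the sitemaps.org protocol and looks like this (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
    <lastmod>2023-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry lists one canonical, live URL; the optional `<lastmod>` element tells crawlers when the page last changed.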
There are several things to consider when creating sitemaps, including how search engines interpret them. We cover a range of these topics in our SEO Office Hours Notes below, along with best practice recommendations and Google's advice on sitemaps.
For more on sitemaps and SEO, check out our article: How to Improve Website Crawlability with Sitemaps.
Sitemaps Submitted Through GSC Will Be Remembered for Longer
Google’s memory for sitemaps is longer for those submitted through Google Search Console. Sitemaps that are submitted through robots.txt or are pinged anonymously are forgotten once they are removed from the robots.txt, or if they haven’t been pinged for a while.
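As a reference for the robots.txt submission method mentioned above, a sitemap is declared with a `Sitemap:` directive (the URL here is a placeholder):

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

If this line is later removed from robots.txt, Google may eventually forget the sitemap, whereas a sitemap submitted through Google Search Console stays associated with the property until you delete it there.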
Sitemap Files Returning 404s Don’t Cause Issues for Google
Sitemap files that return 404s don't cause any issues for Google from an SEO perspective; they will simply be treated as 404s.
Sitemaps Are More Critical for Larger Sites with High Churn of Content
Sitemaps are more useful for larger websites that have a lot of new and changing content. It is still best practice to have sitemaps for smaller sites whose content rarely changes, but they are less critical for helping search engines find new pages.
Static Sitemap Filenames Are Recommended
John recommends using static sitemap filenames that don't change each time they are generated, so that Google doesn't waste time crawling sitemap URLs which no longer exist.
Make Sure There is a Clear Connection Between Your Mobile & Desktop Sites
It’s possible to include m. pages in your main sitemap file to help Google discover and crawl them for mobile-first, but if there is a clear connection between the desktop and mobile sites then this won’t be necessary.
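One way to signal that connection within a sitemap itself is Google's separate-URLs mobile annotation, sketched below (the desktop and m-dot URLs are placeholders; rel="alternate" on the desktop page and rel="canonical" on the mobile page are the more common way to make the pairing explicit):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page</loc>
    <!-- Points Google to the equivalent mobile URL -->
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="https://m.example.com/page"/>
  </url>
</urlset>
```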
Canonicals Are Chosen by Google Using XML Sitemap URLs
XML sitemap URLs are used to help inform Google’s decision on which URL is chosen to be the canonical.
Crawl Frequency Attribute in XML Sitemaps Doesn’t Impact Crawl Rate
Google takes no notice of the change frequency attribute in XML sitemaps, nor of any priority set. Only the last modification timestamp will impact crawl rate.
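In sitemap terms, that means only `<lastmod>` carries weight for Google; `<changefreq>` and `<priority>` are valid protocol elements but are ignored (URL and values below are placeholders):

```xml
<url>
  <loc>https://www.example.com/blog/post-1</loc>
  <!-- Google uses this timestamp as a recrawl signal -->
  <lastmod>2023-02-01T10:00:00+00:00</lastmod>
  <!-- Part of the sitemap protocol, but ignored by Google -->
  <changefreq>daily</changefreq>
  <priority>0.8</priority>
</url>
```

Keeping `<lastmod>` accurate matters more than setting it often: updating the timestamp without actually changing the page can erode trust in the signal.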
Google Uses Scheduler to Determine Recrawl Date
Google uses a scheduler before crawling to work out when URLs need to be recrawled. Google will increase crawl rate if it gets signals that it needs to do so, e.g. an updated modification date in sitemaps or internal linking (especially from the homepage).
New Search Console Will Show More Sitemap Data
The new Search Console will show more detailed information regarding sitemaps and more detail per sitemap file.
Use Several Smaller Sitemaps to Locate Indexing Issues
Having several smaller sitemaps for the different sections of your site is recommended for diagnosing indexing issues.
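A common way to structure this is a sitemap index that references one sitemap per site section, so Search Console can report indexing coverage per file (filenames and sections here are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

If, say, the blog sitemap shows a much lower proportion of indexed URLs than the others, you know which section of the site to investigate. Note that the static filenames also align with John's recommendation above.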