Saturday 25 October 2014

Google recommends submitting a sitemap and outlines best practices

Google and other search engines can discover all of your pages as long as they're all linked. Google recommends submitting an XML sitemap to optimise this process and provide them with more information. In this recent post on the Google Webmaster Central Blog, Alkis Evlogimenos outlines best practices.

I'm happy to see that the sitemap produced by Scrutiny complies with the key points in Google's post, namely:

  • The format of the XML itself, complying with the standard protocol
  • Inclusion of the fields most important to Google: the URL itself and the last modified date
  • Exclusion of URLs disallowed by your robots.txt (remember to go to Scrutiny > Preferences > Sitemap > tick 'check for robots.txt')
  • Exclusion of duplicate pages marked as such by the canonical tag
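
For reference, a minimal sitemap fragment following the standard sitemaps.org protocol, showing the two fields mentioned above (the URL and date here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-10-25</lastmod>
  </url>
</urlset>
```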

Scrutiny manages all of this for you, picking up the last modified date from the server and assigning priority automatically according to your site structure. You can also control the changefreq and priority fields yourself by setting up rules.
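Outside Scrutiny, the same per-URL fields can be assembled programmatically; a minimal Python sketch using the standard library's xml.etree (the URLs, dates, changefreq and priority values below are made up for illustration):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build sitemap XML from (url, lastmod, changefreq, priority) tuples."""
    ET.register_namespace("", NS)  # emit the sitemap namespace as the default
    urlset = ET.Element("{%s}urlset" % NS)
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod
        ET.SubElement(url, "{%s}changefreq" % NS).text = changefreq
        ET.SubElement(url, "{%s}priority" % NS).text = priority
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical site: the home page changes weekly, the blog daily
xml = build_sitemap([
    ("http://www.example.com/", "2014-10-25", "weekly", "1.0"),
    ("http://www.example.com/blog/", "2014-10-20", "daily", "0.8"),
])
print(xml)
```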

After scanning your site and generating the sitemap, you can export the XML sitemap and optionally FTP it to your server.

To automate this process, you can schedule your scan (weekly or monthly) with 'Save sitemap' and 'ftp sitemap' as options when the scan finishes.
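The upload step that Scrutiny automates looks roughly like this if done by hand; a minimal sketch using Python's standard ftplib module (the host, credentials, and file names are hypothetical):

```python
from ftplib import FTP

def upload_sitemap(host, user, password,
                   local_path="sitemap.xml", remote_name="sitemap.xml"):
    """Upload a locally generated sitemap to a web server via FTP."""
    ftp = FTP(host)
    ftp.login(user, password)
    with open(local_path, "rb") as f:
        # STOR transfers the file in binary mode to the server
        ftp.storbinary("STOR " + remote_name, f)
    ftp.quit()

# Example call (placeholder credentials):
# upload_sitemap("ftp.example.com", "username", "password")
```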

Scrutiny has a 30-day free and unrestricted trial. Download the most recent version here, and see a full list of Scrutiny's features here.
