June 13, 2006
We recently updated our webmaster help center. Two new sections that you may find particularly useful are:
- Using a robots.txt file
- Understanding HTTP status codes
Using a robots.txt file
We've added new help topics to the How Google crawls my site section. These topics include information on:
- How to create a robots.txt file (a sample file follows this list)
- Descriptions of each user-agent that Google uses
- How to use pattern matching
- How often we recrawl your robots.txt file (around once a day)
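To illustrate, here is a minimal robots.txt file; it is a sketch rather than an excerpt from the help center, and example.com and the specific rules are assumptions. It addresses two of Google's user-agents and uses the pattern-matching syntax Googlebot supports:

```
# Hypothetical robots.txt, served from the root of the site
# (for example, http://www.example.com/robots.txt).

# Block Google's image crawler from the entire site.
User-agent: Googlebot-Image
Disallow: /

# For Google's main crawler, use pattern matching:
# * matches any sequence of characters, and $ anchors the
# pattern to the end of the URL, so this blocks all PDFs
# as well as everything under /private/.
User-agent: Googlebot
Disallow: /*.pdf$
Disallow: /private/

# All other crawlers may crawl everything.
User-agent: *
Disallow:
```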
Understanding HTTP status codes
This section explains the HTTP status codes that your server might return when we request a page on your site. We display HTTP status codes in several places in Google Sitemaps (such as on the robots.txt analysis page and on the crawl errors page), and some site owners have asked us for more information about what these codes mean.
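As a quick illustration of what these codes look like in practice (this sketch is ours, not part of the help center, and the URL is a placeholder), the following Python script requests a page and prints the HTTP status code the server returns:

```python
import urllib.request
import urllib.error

# Placeholder URL; replace with a page on your own site.
url = "http://www.example.com/some-page.html"

try:
    with urllib.request.urlopen(url) as response:
        # A successful request typically returns 200 (OK).
        print(response.status, response.reason)
except urllib.error.HTTPError as err:
    # The server answered, but with an error status such as
    # 404 (Not Found) or 500 (Internal Server Error).
    print(err.code, err.reason)
except urllib.error.URLError as err:
    # No HTTP status at all: DNS failure, refused connection, etc.
    print("Request failed:", err.reason)
```

The same codes you see in this kind of output are what Google Sitemaps reports when our crawler requests your pages.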