Google has limited its robots.txt support to four fields and clarified its position on unsupported directives.
Key Points:
- Google supports only four specific robots.txt fields.
- Unsupported directives in robots.txt will be ignored.
- Check your robots.txt files with this update in mind.
In its recent Search Central document, Google clarified its position on unsupported fields in robots.txt files.
Key Update:
Google has said that its crawlers do not support any fields that are not listed in its robots.txt documentation.
This clarification is part of Google’s efforts to guide website owners and developers.
Google’s statement:
“We occasionally get questions about fields that aren’t listed, and we want to make it clear that they aren’t supported.”
This update is intended to reduce confusion and prevent websites from relying on unsupported directives.
What it means:
1. Use supported fields: Only use fields that are mentioned in Google’s documentation.
2. Review robots.txt: Audit existing robots.txt files to make sure they don’t contain unsupported directives; a short audit sketch appears at the end of this article.
3. Understand the limitations: Google’s crawlers may not recognize some third-party or custom directives.
Supported fields:
According to the updated documentation, Google officially supports the following fields (an example file follows the list):
User-agent
Allow
Disallow
Sitemap
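For reference, here is an illustrative robots.txt that uses only these four supported fields. The paths and sitemap URL are placeholders, not examples taken from Google’s documentation.

```
User-agent: *
Allow: /public/
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```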
Important points:
Although the update does not call them out by name, it follows that Google does not support common directives such as “crawl-delay,” even though other search engines recognize them.
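As a quick way to act on point 2 above, the sketch below scans a robots.txt file for fields outside Google’s supported set. It is a minimal illustration, not an official tool: the parsing rules (skipping comments and blank lines, case-insensitive field names) and the sample file are assumptions made for demonstration.

```python
# Minimal sketch: flag robots.txt fields that Google's crawlers will ignore.
# The supported set reflects Google's updated documentation; the parsing
# rules here (comment stripping, case-insensitive fields) are assumptions.

GOOGLE_SUPPORTED_FIELDS = {"user-agent", "allow", "disallow", "sitemap"}

def find_unsupported_fields(robots_txt: str) -> list[tuple[int, str]]:
    """Return (line number, field name) pairs for unsupported fields."""
    unsupported = []
    for lineno, line in enumerate(robots_txt.splitlines(), start=1):
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if not line or ":" not in line:
            continue
        field = line.split(":", 1)[0].strip().lower()
        if field not in GOOGLE_SUPPORTED_FIELDS:
            unsupported.append((lineno, field))
    return unsupported

if __name__ == "__main__":
    sample = """\
User-agent: *
Crawl-delay: 10
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
"""
    for lineno, field in find_unsupported_fields(sample):
        print(f"line {lineno}: '{field}' is not supported by Google and will be ignored")
```

Run against the sample file, this flags “crawl-delay” on line 2, which Google ignores even though some other crawlers honor it.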
Image and content credit: Search Engine Journal