Noindex Directive in Robots.txt Is No Longer Supported by Google

Google has officially announced that Googlebot will no longer support robots.txt rules related to indexing, most notably the noindex directive.

The noindex rule was never an official part of the robots.txt specification, which is why Google is ending support for it.
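
For context, the retired rule was typically written as a line such as `Noindex: /private/` beneath a `User-agent:` group in a robots.txt file (the /private/ path here is just an illustration). After the cutoff date below, Googlebot simply ignores such lines.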

To cut to the chase: if your site relies on a noindex directive in robots.txt, you have until September 1, 2019 to make the switch. Google recommends the following alternatives:

  • Use noindex in robots meta tags instead. Google describes this as the most effective way to remove URLs from the index when crawling is allowed (see the sketch after this list).
  • Return 404 or 410 HTTP status codes to drop URLs from Google’s index once they are crawled and processed.
  • Password-protect pages: URLs behind a login are normally removed from Google’s index (unless markup is used to indicate otherwise).
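
To make the first two alternatives concrete, here is a minimal sketch using Flask. The framework and route paths are our own illustration, not part of Google’s announcement: one route serves a noindex robots meta tag plus the equivalent X-Robots-Tag header, and another returns a 410 status.

```python
# Minimal sketch (hypothetical routes, Flask chosen for brevity) of two
# recommended alternatives to the robots.txt noindex directive.
from flask import Flask, Response

app = Flask(__name__)

@app.route("/crawlable-but-unindexed")
def noindex_page():
    # Alternative 1: a noindex robots meta tag keeps the URL out of the
    # index while crawling remains allowed.
    html = (
        "<!doctype html><html><head>"
        '<meta name="robots" content="noindex">'
        "<title>Example</title></head>"
        "<body>Crawlable, but not indexed.</body></html>"
    )
    resp = Response(html, mimetype="text/html")
    # The X-Robots-Tag HTTP header is the equivalent for non-HTML resources.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

@app.route("/removed-page")
def removed_page():
    # Alternative 2: a 404 or 410 status drops the URL from the index
    # once it has been crawled and processed.
    return Response("This page has been removed.", status=410)

if __name__ == "__main__":
    app.run()
```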

See more options at https://webmasters.googleblog.com/2019/07/a-note-on-unsupported-rules-in-robotstxt.html
