Why Google Webmaster Tools’ “Crawl Errors” is a Misnomer

Google Webmaster Tools recently gave us new features while removing an often-discussed one.

We often discuss new features, deprecated ones, and perhaps changes in layout. But what about the terms used? It’s very important that the message conveyed by Google Webmaster Tools is accurate and does not leave any room for confusion.

I think that “Crawl Errors” is not an appropriate term for Google Webmaster Tools to use. The reason is simple: not all the reports detailed in this section are really errors.


To me, “Restricted by robots.txt” is not an error, provided that the webmaster made a deliberate effort to block certain pages within the site from being crawled by search engines. When I block all search result pages from being crawled by placing

Disallow: /search

in my robots.txt, those URLs shouldn’t be counted among the “Crawl Errors”. My suggested replacement for “Crawl Errors” is “Crawl Issues”: a section that draws a webmaster’s attention to possible errors, such as inadvertently blocking content from search engine crawls.
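As a quick sanity check on which URLs a rule like the one above actually blocks, a webmaster can test it with Python’s standard `urllib.robotparser` module. This is just an illustrative sketch; `example.com` and the sample URLs are placeholders, not anything from Google Webmaster Tools itself.

```python
# Sketch: test which URLs a "Disallow: /search" rule blocks, using
# Python's standard urllib.robotparser. example.com is a placeholder.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /search",
])

# Search-result URLs are deliberately blocked -- not an "error".
print(rp.can_fetch("Googlebot", "http://example.com/search?q=widgets"))  # False

# Regular content pages remain crawlable.
print(rp.can_fetch("Googlebot", "http://example.com/about"))  # True
```

If the blocked URLs are exactly the ones the webmaster intended to block, reporting them as errors only adds noise.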