This morning, I saw a simple but nice addition to the ever-expanding scope of Google Webmaster Tools: a new “Labs” section featuring two functions, Fetch as Googlebot and Malware Details.
Fetch as Googlebot
This tool lets you simulate what a page looks like through the eyes of Googlebot. Sure, we have seen other tools that do the same thing (and more), but the fact that this one comes from Google itself, emulating its own search engine robot, makes it more credible and reliable. Just enter the URL you’d like to check (limited to the websites you manage through Google Webmaster Tools). The output is the page’s source code, much like what you get when you do a “View Source” in your browser. There you can find out whether there are enough links to let Googlebot reach other pages, or whether a given piece of code is visible to the crawler at all. This helps identify possible problems with content navigation and provides hints on how to fix them.
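You can roughly approximate this outside Webmaster Tools by requesting a page while presenting Googlebot's documented user-agent string. This is only a sketch of the idea, not what Google's tool actually does internally (the function name `fetch_as_googlebot` is my own, and real Googlebot verification involves more than a user-agent header):

```python
import urllib.request

# Googlebot's documented user-agent string (assumption: current at time of writing).
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as_googlebot(url: str) -> str:
    """Return the raw markup of `url` as served to a Googlebot-like client.

    Hypothetical helper: sites that vary content by user-agent will show
    roughly what the crawler sees; cloaking detection is out of scope.
    """
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req, timeout=10) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset, errors="replace")

# Example (requires network access):
# html = fetch_as_googlebot("http://example.com/")
# print(html[:200])
```

From the returned source you can then grep for `<a href` tags to check whether enough links are exposed for the crawler to follow.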
Malware Details
This feature provides details on any malware activity afflicting your website. It is useful for finding out how many pages are infected, so you can pinpoint which pages to visit and remove the offending pieces of source code. Such information can also serve as a gauge of whether our machines have become zombies operating for a third-party hacker, or whether our web host’s security measures were simply not enough to deter these malware intruders. I checked this feature on one of my sites with malware problems and so far it didn’t provide any extra detail. So it’s safe to assume (as with any Labs feature of Google) that this is still subject to tweaks and a full version is still updates away.
In addition to that, I also noticed that Crawl Errors now displays the full URL instead of a truncated one (abbreviated with an ellipsis). So there’s no need to figure out the entire URL by hovering over the link or visiting the problematic URL.