“New to Search Engine Optimization, or a veteran in the field?
You still need guidance from Google, at least to the extent of using Webmaster Tools!”
Now what does Webmaster Tools from Google offer you? Does it guarantee better placement once you use it? No.
But Webmaster Tools can help you diagnose errors on your website and improve your pages’ visibility.
A Google account (as with any other service Google offers) is required to start using the service. You enter your website’s primary URL, and then you need to verify it, either by adding a meta tag to the site’s header or by uploading an HTML file that Google instructs you to add. This verification confirms that you are the actual webmaster, or an SEO entrusted by the owner, who really controls the website.
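For the meta-tag route, the tag looks roughly like the sketch below. The content value shown is a placeholder; Google generates a unique verification token for each site.

```html
<head>
  <!-- Placeholder token: Google issues a unique value per site -->
  <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
</head>
```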
Once the verification is done, Google opens up all of the webmaster tools services to you.
Diagnostic tools tell you about any errors that Google encountered while checking your website. The errors displayed include:
URLs not followed
URLs restricted by robots.txt
URLs timed out
As your website keeps growing each day and gets more web pages added to it, you should check these reports now and then to make sure that no problems exist and that all new content is being spidered.
Top Search Queries
The Top Search Queries report lets you know which search queries in Google most often returned pages from your site, and which of those results were clicked. The report shows you how people are getting to your site from Google search.
Crawl stats report
Crawl Stats shows the actual crawling statistics of Google’s bots on your website. It also displays the PageRank value currently attained by each page of your website that Google has indexed. These statistics help you monitor the fruitfulness of your link-building campaigns.
With Index Stats you can run Google queries on your site using operators such as site:, link:, cache:, info: and related:.
Subscriber Stats helps webmasters see how many subscribers their blogs, RSS feeds and other feeds have. It displays the number of users who have subscribed to these feeds using Google products such as iGoogle, Google Reader, or Orkut.
Google allows you to submit XML sitemaps of your website, so the search engine spiders are informed about all of the pages your website carries. Here you can upload and manage sitemap files for your website.
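A sitemap is just an XML file listing your URLs. As a minimal sketch, you can generate one with Python’s standard library (the example.com URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string for the given page URLs."""
    # The xmlns value is the standard sitemap protocol namespace.
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages, for illustration only
print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

Real sitemaps may also carry optional tags such as lastmod and changefreq per URL; the loc element above is the only required one.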
Robots.txt is a protocol that tells search engine spiders which areas of your website they are, and are not, allowed to crawl.
You can check to make sure your robots.txt file is not only up to date, but also valid in terms of how it is written. A simple error in this file could cause search engines to drop dozens or even hundreds of your site’s pages from the index.
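One way to sanity-check a robots.txt file before publishing it is Python’s standard robotparser module. The rules below are a hypothetical example, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents: block the /admin/ area, allow the rest
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under "User-agent: *" here, so /admin/ is blocked
# while ordinary pages remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/admin/secret.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))         # True
```

If a rule you expected to block a page returns True here, the file is probably malformed or the path pattern is wrong, and that is exactly the kind of mistake the Webmaster Tools robots.txt checker flags.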