Google has given us the gift of Google Webmaster Tools: one convenient location with access to a great set of tools and reports for your site. In Google’s own words, Google Webmaster Tools is:
“Statistics, diagnostics and management of Google's indexing of your website, including Sitemap submission and reporting.”
Verification: Before you get started, you need to tell Google which sites you want included in your account. Enter the URL as prompted, and then verify your ownership/control of that site.
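One common method, for example, is adding a verification meta tag to your home page’s head section. The tag below is a sketch only; Google generates the exact tag and token for your account, and uploading a Google-named HTML file to your server is the usual alternative:

```html
<!-- Sketch only: Google generates the actual token for your account -->
<meta name="google-site-verification" content="your-unique-token-here" />
```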
A peek inside Google Webmaster Tools:
Diagnostics: The Diagnostic tools are here to tell you about any errors Google has encountered while crawling your site. They report on the following error types:
- HTTP errors
- Not found (404)
- URLs not followed
- URLs restricted by robots.txt
- URLs timed out
- Unreachable URLs
Some Statistics:
Top Search Queries: This report shows you how people are getting to your site from Google searches.
What Googlebot Sees: This is a great way to learn how others link to you and how those links are weighed alongside your on-page content. The report has multiple sections:
- Phrases used in external links to your site (anchor text)
- Keywords in your site’s content
- Keywords in inbound links
- The actual content on your site (ordered by density)
Index Stats: The index stats are shortcuts to advanced Google queries on your site using search operators. Shortcut links are provided for the following operators (sample queries follow the list):
- site:
- link:
- cache:
- info:
- related:
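As a quick illustration of what those shortcuts run for you (with example.com standing in for your own domain):

```
site:example.com       pages from example.com in Google's index
link:example.com       pages that link to example.com
cache:example.com      Google's cached copy of a page
info:example.com       information Google has about a page
related:example.com    pages Google considers similar to example.com
```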
Links: The link reports in Webmaster Tools are limited, but they do provide ways to measure internal and external link popularity.
Google Sitemaps: Google Sitemaps is the feature the entire Webmaster Tools service was originally built around. Here you can upload and manage XML-based sitemap files that catalog all of the pages on your site.
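For reference, an XML sitemap follows the sitemaps.org protocol. A minimal sketch with placeholder URLs and dates looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, the rest are optional hints -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```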
Analyze robots.txt: Robots.txt is the file Googlebot and other spiders check as soon as they land on your site for instructions on what they can and cannot access. If you don’t want spiders indexing your images, just disallow them. If you’d prefer not to have certain areas of your site indexed and available to the searching public, go ahead and restrict access. This is where you can check that your robots.txt file is not only up to date, but also valid in terms of how it is written.
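For instance, a simple robots.txt along those lines might read as follows (the directory names are placeholders for your own):

```
# Keep all spiders out of a private area of the site
User-agent: *
Disallow: /private/

# Keep Google's image crawler away from your image directory
User-agent: Googlebot-Image
Disallow: /images/
```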
Set crawl rate: This area is very informative; it provides an overview of Googlebot’s activity on your site. If you have recently updated your site or acquired new links, come back and check this section to see whether Googlebot activity increases in response to your work.
Set preferred domain: Tired of seeing both www.domain.com and domain.com in your search results? Or maybe you are worried about canonicalization and how it will impact your optimization and links? All you have to do is use the preferred domain tool: it instructs Google to display URLs according to your preference, so all listings can appear under www.domain.com.
Of course, if you’re not worried about this, you can also opt to have no association set at all.
Remove URLs: This automated tool helps resolve issues with pages that no longer exist or pages that you simply want removed from Google’s index.

In conclusion, the above is a general outline of the important features in Webmaster Tools.
As you use these tools more you’ll find a number of new ways to use the data. Remember, knowledge is power and Google is sharing some great info about your site.
https://www.google.com/webmasters/tools/
Tip! Be aware of what information Google makes available for download, and schedule a download every 60 days or so: Google only provides data covering 90-day periods, so anything older is lost unless you save it.