Leveraging Webmaster Tools for SEO Success

The Google Webmaster Tools panel has been around for quite some time, and Live Webmaster Tools joined a bit later. I get a lot of questions from colleagues and clients about whether these tools are really helpful, and lately I have found myself spending more and more time inside both, finding useful information and amending my SEO strategy accordingly.

Webmaster Tools 101

So why do we need it and what’s in it for me? Well, in general, the webmaster tools panel is the only legitimate, official feedback platform the engines supply us. That on its own is more than enough for me to say, “hey, let’s take a look at what these guys have to say for themselves”.

The very basic features are:

* One place to submit standard XML sitemaps directly to the engines
* Feedback on crawling errors and problems
* Feedback on duplicate content issues (Google only)
* Feedback and settings for regional and domain name issues
* Control over “sitelinks” feature (Google only)

Leveraging Webmaster Tools for SEO Success

While these tools are mostly ‘nice to have’ on their own, in the following section I will give a few pointers, with examples, on how webmaster tools research became my primary place to look for the root cause of indexing and ranking issues, and how those issues can be solved.

* Feed the crawlers with your content
Create XML sitemaps, use proper formatting and include only relevant content. Make sure your sitemaps update automatically so Google doesn’t have to crawl the same content again and again; a minimal sketch follows below.
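
As a rough illustration, here is a minimal sketch of an automated sitemap generator. The get_urls() source, the example.com URLs and the output path are placeholders for whatever your own CMS or database actually exposes.

```python
# Minimal sitemap generator sketch (standard library only).
# get_urls() and the example.com URLs are placeholders; in practice you would
# pull the URL list and last-modified dates from your CMS or database.
from datetime import date
from xml.sax.saxutils import escape

def get_urls():
    # Placeholder: replace with a query against your own content store.
    return [
        ("https://www.example.com/", date.today()),
        ("https://www.example.com/products/widget", date(2009, 1, 15)),
    ]

def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod in urls:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(loc)}</loc>")
        lines.append(f"    <lastmod>{lastmod.isoformat()}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    # Regenerate the file on a schedule (cron, build step) so it stays current.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(get_urls()))
```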

* Find & Fix Broken links: lists of broken links from the webmaster tools panel will help us fix these broken links which can easily disturb our users. Moreover, they cause the crawlers extra work which ends up with bad results, resulting in reduced indexing and credibility for the entire web site.

* Find server problems
The “crawl stats” panel shows the last 90 days of Google’s crawling history on your site: how many pages were downloaded every day, how much data was downloaded and the bandwidth it consumed. Optimizing server performance and code standards can improve these parameters significantly.
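
If you want to cross-check those numbers against your own server, a rough sketch like the one below tallies Googlebot requests and bytes per day from a combined-format access log. The log path and field positions are assumptions about your particular server setup.

```python
# Rough crawl-stats cross-check: count Googlebot requests and bytes per day
# from an Apache/Nginx combined-format access log. "access.log" and the
# field positions are assumptions about your own server configuration.
from collections import defaultdict

requests_per_day = defaultdict(int)
bytes_per_day = defaultdict(int)

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        parts = line.split()
        if len(parts) < 10:
            continue
        # Combined format: host - - [10/Feb/2009:13:55:36 +0000] "GET /x HTTP/1.1" 200 2326 ...
        day = parts[3].lstrip("[").split(":")[0]
        size = parts[9]
        requests_per_day[day] += 1
        if size.isdigit():
            bytes_per_day[day] += int(size)

for day in sorted(requests_per_day):
    print(f"{day}\t{requests_per_day[day]} requests\t{bytes_per_day[day]} bytes")
```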

* Fix duplicate content issues
Google Webmaster Tools reports duplicate descriptions and titles under the Diagnostics > Content area. Google shows samples of the duplications it found, including the actual duplicate URLs. All you need to do is go back to the developers and fix the problems, and you have more spider food.
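
You can also spot these duplications yourself before Google does. The sketch below fetches a list of your own pages and groups them by title text; the urls.txt input is, again, a placeholder.

```python
# Duplicate-title finder sketch: fetch your own pages and group them by
# <title> text so repeated titles stand out. "urls.txt" is a placeholder.
import re
import urllib.request
from collections import defaultdict

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.I | re.S)

def title_of(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read(200_000).decode("utf-8", errors="replace")
    match = TITLE_RE.search(html)
    return match.group(1).strip() if match else ""

if __name__ == "__main__":
    by_title = defaultdict(list)
    with open("urls.txt", encoding="utf-8") as f:
        for url in (line.strip() for line in f if line.strip()):
            by_title[title_of(url)].append(url)
    for title, urls in by_title.items():
        if len(urls) > 1:
            print(f"Duplicate title '{title}' on {len(urls)} pages:")
            for u in urls:
                print(f"  {u}")
```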

* Improve crawling efficiency
In the past, it was a well-known habit to bombard Google with as many pages and as much content as possible. This was, to say the least, a bad idea. While we do want Google to take as much content from our website as possible, having it index our “terms of use” page over and over again is a waste of resources. Based on the assumption that every website receives a certain data and bandwidth quota from the crawlers, we want them to maximize the effectiveness of their visit.
Therefore, proper use of robots.txt directives and nofollow links (internal ones too) can help guide Google to our core content and the most important pages of the website.
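
As a rough illustration, the sketch below writes a robots.txt that blocks a few low-value paths and then uses the standard-library parser to confirm the core pages remain crawlable. The blocked paths are purely hypothetical examples, not a recommendation for any particular site.

```python
# robots.txt sketch: block assumed low-value paths, then verify with the
# standard-library parser that core pages remain crawlable by Googlebot.
# The disallowed paths below are hypothetical examples.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /terms-of-use
Disallow: /search
Disallow: /print/

Sitemap: https://www.example.com/sitemap.xml
"""

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(ROBOTS_TXT)

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for path in ("/", "/products/widget", "/terms-of-use"):
    allowed = parser.can_fetch("Googlebot", "https://www.example.com" + path)
    print(f"{path}\tallowed={allowed}")
```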

* Remove unwanted content
Using the manual URL removal tool, you can remove certain pages, and even complete websites, from the Google index quickly and easily. The requirement is that the pages either have a block directive in the robots.txt file, carry a noindex meta tag, or return a 404. Then you can simply enter the pages from your site you want removed and they will be removed almost immediately.
This also helps improve crawling and ranking efficiency.
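
Before submitting a removal request, it is worth confirming the page already meets one of those requirements. Here is a minimal sketch that checks for a 404/410 response or a noindex tag; the URL is a placeholder, and a robots.txt block (the third option) is not checked here.

```python
# Pre-removal check sketch: confirm a URL returns 404/410 or carries a
# noindex meta tag before asking for removal. A robots.txt block would
# also qualify but is not checked here. The URL is a placeholder.
import urllib.request
import urllib.error

def removal_ready(url):
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            html = resp.read(200_000).decode("utf-8", errors="replace").lower()
            return "noindex" in html  # crude check for a noindex meta tag
    except urllib.error.HTTPError as e:
        return e.code in (404, 410)

if __name__ == "__main__":
    url = "https://www.example.com/old-page"  # placeholder
    print(f"{url} ready for removal: {removal_ready(url)}")
```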

* Find bad outgoing links
A new feature in the Live Webmaster Tools panel reports pages on your site suspected of hosting malware, as well as outgoing links pointing at malware pages. This is really useful and may solve a lot of problems.
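
To complement that report, a simple script can list the external links on a page so you can review where you are pointing readers. The page URL below is a placeholder.

```python
# Outgoing-link lister sketch: extract external links from one of your pages
# for manual review. The page URL is a placeholder.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = "https://www.example.com/blog/some-post"  # placeholder
with urllib.request.urlopen(page, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

collector = LinkCollector()
collector.feed(html)
site_host = urlparse(page).hostname
for href in collector.links:
    absolute = urljoin(page, href)
    if urlparse(absolute).hostname not in (None, site_host):
        print(absolute)  # external destination worth reviewing
```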

* Pump up your link development strategy
Both engines show you exactly which pages are linking to your pages, from within your domain and from outside it. Analyzing that list can help you understand how the engines see your incoming links, where you need to improve and how to do it.
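
For example, assuming you have exported your external links report to a CSV with the linking URL in the first column (the file name and column position are assumptions about your own export), a short sketch like this one counts links per linking domain:

```python
# Incoming-link aggregation sketch: count links per linking domain from an
# exported report. "links.csv" and the first-column layout are assumptions.
import csv
from collections import Counter
from urllib.parse import urlparse

domains = Counter()
with open("links.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f):
        if not row or not row[0].startswith("http"):
            continue  # skip headers and blank rows
        domains[urlparse(row[0]).hostname] += 1

for domain, count in domains.most_common(20):
    print(f"{count:5d}  {domain}")
```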

* Improve your regional rankings
Google Webmaster Tools and Live Webmaster Tools both offer the option to set a geo-target, which may differ from your actual server or IP address location. Moreover, Live Webmaster Tools offers a very cool feature that presents the incoming links to the domain along with their regional setting, so you can easily see where most of your links come from and optimize your link strategy accordingly.

