Google Webmaster Central And SEO
With the release of Webmaster Central, Google has made a strong statement that serious website owners need to be aware of a range of data about their sites. Crawl errors, backlinks, and page load times are all available for a webmaster to study and parse through in order to refine their search engine optimization strategy.
Along with Yahoo and Bing, Google provides information at Sitemaps.org on how to create an XML file that feeds information about a site to the search engines. Google is committed to assisting webmasters with their sites, even going so far as to enable comments on the official Webmaster Central blog, the only official Google blog to have done so.
Statistics for Unverified Sites:
There are two ways to take advantage of the Webmaster tools: site owners can remain unverified and get a limited look at their site through Google’s eyes, or verify and receive a more complete one. For sites that are not verified, there are three tools to learn to love: site information covering Sitemap details and errors; basic indexing information about your site; and a robots.txt analysis and correction page, where site owners can test their robots.txt file against several user-agents to make sure the search engines can interpret it correctly.
Upon adding a site to your Google account, you are shown the My Sites page, which lists every site associated with that account. If you have only one site added, you are taken directly to the Diagnostic tab for that site, where you can see the last crawl date and the site’s index status (whether or not it is included in the index). From this page you can also test your robots.txt file and find out whether Googlebot is able to spider your site without running into problems. An improperly written robots.txt file can block the search engines from indexing relevant content.
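If you want to sanity-check a robots.txt file outside the console, the same “can this user-agent fetch this URL?” question can be answered with a few lines of Python’s standard library. The sketch below is only illustrative; the domain and paths are placeholders, not anything taken from the console.

    # Rough robots.txt check; example.com and the paths below are placeholders.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()  # fetch and parse the live robots.txt

    # Ask whether Googlebot may crawl specific URLs, mirroring the console's robots.txt test.
    for path in ("/", "/private/report.html", "/blog/latest-post"):
        allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
        print(path, "allowed" if allowed else "blocked")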
From the Sitemaps tab on the My Sites page, you can view information about the Sitemaps you have uploaded to Google and see any errors the spiders have encountered. The information includes the date the Sitemap was last submitted as well as the date it was last downloaded by Google. You can delete any Sitemap and resubmit an updated version of a current one. Pay attention to the section that lists error codes; some of these could hinder the indexing of your site. Remember, the goal of the Sitemaps protocol is to get your site indexed, and errors in the document will get in the way of that.
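Resubmitting an updated Sitemap does not have to go through the console alone; at the time of writing Google also accepted a simple HTTP “ping” announcing that a Sitemap had changed. The sketch below assumes that ping endpoint and a placeholder Sitemap URL, so treat it as illustrative rather than definitive.

    # Hypothetical Sitemap resubmission ping; the Sitemap URL is a placeholder
    # and the ping endpoint is assumed from Google's documentation of the era.
    from urllib.parse import quote
    from urllib.request import urlopen

    sitemap_url = "https://example.com/sitemap.xml"
    ping = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

    with urlopen(ping) as response:
        # A 200 response only means the ping was received, not that the Sitemap is error-free.
        print(response.status, response.reason)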
When presented with these powerful tools and data sets, many webmasters are simply overwhelmed and don’t know where to begin or how to leverage the information to their advantage. The most frequently asked questions concern the XML feed. What is it? Do you have to have it? Will your site be indexed without it? As with most things Google, the answer is multipart.
The Sitemaps protocol, which is supported by all three major engines, is simply an XML feed designed to help the engines discover and index all the pages of your Web site. The feed gives information about when pages were created or last updated and how important they are relative to the rest of the site. If a page is very important to your site and is not indexed, the feed can help get that page spidered. The XML feed will not improve your rankings, except insofar as it may help the search engine find pages supporting your theme that it did not previously know about.
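As a concrete illustration, a Sitemap entry carries little more than a URL plus optional hints about when the page last changed and how important it is. The sketch below builds a minimal file with Python’s standard XML library; the URLs, dates, and priorities are placeholders, not recommendations.

    # Build a minimal Sitemap per the sitemaps.org protocol; all values are placeholders.
    import xml.etree.ElementTree as ET

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    ET.register_namespace("", NS)

    urlset = ET.Element("{%s}urlset" % NS)
    for loc, lastmod, priority in [
        ("https://example.com/", "2007-06-01", "1.0"),
        ("https://example.com/services.html", "2007-05-20", "0.8"),
    ]:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod    # when the page last changed
        ET.SubElement(url, "{%s}priority" % NS).text = priority  # importance relative to other pages

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)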
The Sitemap isn’t a requirement for being indexed; it just makes indexing easier. Google states:
“A Sitemap provides an additional view into your site (just as your home page and HTML site map do). This program does not replace our normal methods of crawling the web. Google still searches and indexes your sites the same way it has done in the past whether or not you use this program. Sites are never penalized for using this service”.
If your site is already being crawled and included in the index, you probably don’t need a Sitemap, but that doesn’t mean that you shouldn’t still be using the webmaster tools to get a better idea of what Google sees when it crawls your site. Each tool is targeted to a different area.
It should be emphasized that even if you do have a Sitemaps file, it does not replace the need for an HTML site map page on your Web site. Despite the similarity of names, each serves a distinctly different purpose. Google’s Webmaster Guidelines suggest that you include a site map on your site for both users and search engine spiders.
Statistics for Verified Sites:
To get the most value out of these tools, your site must be verified. Viewing your site the way Google sees it is often an enlightening experience and will alert you to errors Google has encountered while crawling your site. The problems most commonly found are often also easy to fix, such as missing pages that redirect to a page returning 200 OK instead of a proper 404, which confuses the search engine. Using the webmaster console can help you identify these problem areas and facilitate the spidering and indexing of your site.
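One quick way to spot that particular problem yourself is to request a URL you know should not exist and look at the status code that comes back. The sketch below assumes a placeholder domain and uses only the Python standard library.

    # Check whether a deliberately bogus URL returns a real 404 or a "soft 404" (200 OK).
    # The domain is a placeholder; substitute one of your own.
    from urllib.error import HTTPError
    from urllib.request import urlopen

    test_url = "https://example.com/this-page-should-not-exist-12345"

    try:
        with urlopen(test_url) as response:
            # Reaching this branch means the server answered with success for a missing page.
            print("Soft 404 suspected:", response.status, "for", test_url)
    except HTTPError as err:
        # A genuine 404 (or 410) is what a well-configured server should return here.
        print("Server returned", err.code, "for", test_url)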
Query information and site analysis are available, giving you a brief snapshot of what Google finds your site relevant for, as well as how (and if) visitors are finding you. The entire console is very user-friendly, even for the most novice of site owners. Using the Webmaster tools can help your search engine optimization campaign by enabling you to make the most of the vast information Google stores about your site.
One of the reasons webmasters love the console is that Google shows link data. Digging through the internal links Google knows about can help you strengthen your site’s architecture, ensuring that every page can be reached by a spiderable text link. Even more exciting is the external link data the tools provide. Want to know how many links to your site Google actually knows about? The tools give you a much better look at your backlinks, though Google engineer Matt Cutts warns that you shouldn’t assume every link you see reported counts for your site. Before this feature was included in the console, the only way to get a glimpse at known links was Google’s link: search operator, but that query was notorious for returning just a sampling of links, and no one could say how large or small that sample might be. It’s now possible to be much more accurate about backlinks, and that kind of knowledge is vital to the success of your SEO project.
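If you want to audit on your own whether a page exposes plain, crawlable text links, a small HTML parse is enough to list the anchors a spider would follow. The sketch below uses Python’s built-in parser and a placeholder URL.

    # List the <a href="..."> links on a page, roughly the text links a spider would follow.
    # The URL is a placeholder; point it at one of your own pages.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    print("Found", len(collector.links), "links:")
    for link in collector.links:
        print(" ", link)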
If you haven’t already incorporated Google’s Webmaster Tools into your search engine optimization strategy, now is the time to do so. Google has made it clear that they want webmasters to pay attention to these metrics and there has never been a better time for you to get involved. For more information, tips, tricks and developments, check out the Official Google Webmaster Central blog.