How to Use Search-Engine-Supplied SEO Tools

Posted by MaryO, Nov 03 2009 03:45 PM

Take advantage of the free tools provided by search engines to improve your overall SEO. This excerpt from The Art of SEO reviews some of the currently available tools and their suggested use.


All three major search engines make an active effort to communicate with webmasters and publishers and provide some very useful tools for SEO professionals.

Search Engine Webmaster Tools

Using the Webmaster Tools provided by Google and Bing is a great way to see how the search engines perceive your site.

First, let’s clear up a popular misconception. Setting up and using a Google Webmaster Tools or Bing Webmaster Tools account provides no new information about your site to the search engines. That is not the purpose of these tools.

You can easily get comfortable with this once you understand how the tools are set up. Creating a Webmaster Tools account with either Google or Bing is quite easy. An important part of creating these accounts is verifying your ownership of the site. To do that, you have two options:

  • Place a search-engine-supplied meta tag on the home page of your domain.

  • Place a special file as specified by the search engine in the root directory of your web server.

Neither of these actions changes or enhances the search engine’s ability to access information about your website.
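If you use the meta-tag option, it is easy to confirm programmatically that the tag actually made it onto your home page. The sketch below uses Python's standard-library HTML parser; the meta name matches Google's documented verification tag, but the token value is invented for illustration.

```python
from html.parser import HTMLParser

class VerificationTagFinder(HTMLParser):
    """Scan HTML for a search-engine verification meta tag."""

    def __init__(self, meta_name):
        super().__init__()
        self.meta_name = meta_name
        self.content = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attributes = dict(attrs)
            if attributes.get("name") == self.meta_name:
                self.content = attributes.get("content")

def find_verification_token(html, meta_name="google-site-verification"):
    """Return the verification token if the tag is present, else None."""
    finder = VerificationTagFinder(meta_name)
    finder.feed(html)
    return finder.content

# Hypothetical home page markup; the token "abc123" is made up.
page = '<html><head><meta name="google-site-verification" content="abc123"></head></html>'
```

Running `find_verification_token(page)` on your fetched home-page HTML tells you at a glance whether the verification tag survived your last template change.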

The intent of these tools is to provide publishers with data on how the search engines view their sites. This is incredibly valuable data that publishers can use to diagnose site problems. We recommend that all publishers leverage both of these tools on all of their websites.

For the next few pages, we will take a look at both of these products.

Google Webmaster Tools

Figure 11.2 shows the type of data you get just by looking at the opening screen once you log in to Google Webmaster Tools (http://www.google.com/webmasters/tools).

Figure 11.2. Google Webmaster Tools opening screen


You can find valuable data in the “not found” report. You can access this by selecting Diagnostics→Crawl Errors→Not Found, as shown in Figure 11.3.

Figure 11.3. Google Webmaster Tools “not found” errors


Here you can see a complete list of URLs that Googlebot encountered somewhere on the Web but for which your web server returned a 404 error instead of a web page. This sometimes happens because another publisher attempting to link to your site did not implement the link correctly.

For that reason, one great way to pick up some links is to simply implement a 301 redirect from the bad URL to the correct URL. If you can’t figure out the correct URL you can always redirect to the site’s home page or the corresponding category one level up from the old URL. Before you do, though, check the pages that are linking to the broken pages, to make sure they represent legitimate links.
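The redirect advice above can be sketched as a small lookup: map each broken URL from the "not found" report to its correct target, and fall back to the category one level up (or the home page) when no mapping exists. All paths here are hypothetical examples, not taken from the report.

```python
# Explicit fixes for known-bad inbound URLs (e.g., a misspelled link
# another publisher created). Paths are invented for illustration.
redirect_map = {
    "/old-artcle": "/old-article",
}

def redirect_target(path):
    """Pick a 301 target for a 404'd path: explicit mapping first,
    then the category one level up, then the home page."""
    if path in redirect_map:
        return redirect_map[path]
    parent = path.rstrip("/").rsplit("/", 1)[0]
    return parent + "/" if parent else "/"
```

The function only chooses the destination; the actual 301 would be issued by your web server or application framework.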

Another cool diagnostic tactic is to look at the error URLs in your Sitemaps file, as shown in Figure 11.4.

Figure 11.4. Google Webmaster Tools “Sitemap errors” report


With this data you can analyze the nature of the problem and resolve it. Figure 11.5 shows another look at the diagnostic data.

Figure 11.5. Google Webmaster Tools URLs restricted by robots.txt


The error shown in Figure 11.5 is common, and occurs when sites mistakenly restrict access to URLs in their robots.txt file. When that happens, this report can be a godsend: it flags pages that Google finds referenced on the Web but that Googlebot is not allowed to crawl.

Of course, this may be what you intended, in which case there is no problem. But when it is not what you intended, this report gives you a quick prompt to fix it.

Google Webmaster Tools also gives you an inside look at potential problems with your meta description tags and your title tags, as shown in Figure 11.6.

Figure 11.6. Google Webmaster Tools “metadata errors” report


For example, if we dig a little deeper on the duplicate title tags we see the screen shown in Figure 11.7.

Figure 11.7. Google Webmaster Tools duplicate title tags


Figure 11.7 shows that there are eight cases of duplicate title tags on the site. Six of them are on the blog and two of them relate to a podcast with Neil Patel. These should be investigated to see whether there are problems that can be resolved.

Next, let’s look at the “top search queries” report, shown in Figure 11.8.

Figure 11.8. Google Webmaster Tools “top search queries” report


Figure 11.8 shows Google's view of the queries for which the site most often appeared in search results (in the left column) and the queries for which the site was most frequently clicked (in the right column). You can also see data from Google regarding the site's position in the SERPs at the time. Note that this data is fairly limited, and most publishers will be able to get better search query data from their web analytics software.

Another report to look at is the “crawl stats” report, shown in Figure 11.9.

Figure 11.9. Google Webmaster Tools “crawl stats” report


The charts in Figure 11.9 look normal and healthy for the site. However, a sudden, sustained dip could be a flag that there is a problem.

Of course, one of the most important reports available from Google Webmaster Tools is the one that depicts the links to the site, as shown in Figure 11.10.

Figure 11.10. Google Webmaster Tools report depicting links to the site


This report is the only way to get a good view of what links Google sees to the site. In addition, you can download the report in spreadsheet form, which makes it easy to manipulate the data. The final Google Webmaster Tools screenshot, Figure 11.11, shows the Settings menu.

Figure 11.11. Google Webmaster Tools Settings menu


Here is what the various settings mean:

Geographic target

If a given site targets users in a particular location, webmasters can provide Google with information that will help determine how that site appears in the country-specific search results, as well as improve Google search results for geographic queries.

Preferred domain

The preferred domain is the domain the webmaster wants to be used to index the site’s pages. If a webmaster specifies a preferred domain as http://www.example.com and Google finds a link to that site that is formatted as http://example.com, Google will treat that link as though it were pointing at http://www.example.com.
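The preferred-domain behavior amounts to host canonicalization, which you can mirror in your own code when normalizing inbound URLs. A minimal sketch, assuming the hypothetical example.com with the www form preferred:

```python
from urllib.parse import urlsplit, urlunsplit

# The preferred domain configured in Webmaster Tools (hypothetical site).
PREFERRED_HOST = "www.example.com"
BARE_HOST = "example.com"

def canonicalize(url):
    """Rewrite bare-domain URLs to the preferred www form, mirroring how
    Google treats links once a preferred domain is set."""
    parts = urlsplit(url)
    if parts.netloc == BARE_HOST:
        parts = parts._replace(netloc=PREFERRED_HOST)
    return urlunsplit(parts)
```

In practice you would enforce the same preference with a server-side 301 redirect, so users and crawlers converge on one host as well.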

Crawl rate

The crawl rate affects the speed of Googlebot’s requests during the crawl process. It has no effect on how often Googlebot crawls a given site. Google determines the recommended rate based on the number of pages on a website.

robots.txt test tool

The robots.txt tool (not shown in Figure 11.11, but accessible at Site Configuration→Crawler Access→Test robots.txt) is extremely valuable as well. The tool allows you to test your robots.txt file using test URLs from your site before you actually push the file live.
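You can run the same kind of pre-flight check locally with Python's standard-library robots.txt parser; the draft rules and URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A draft robots.txt to test before pushing it live; the path is invented.
draft = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(draft.splitlines())

# Check sample URLs from the site against the draft rules.
blocked = not parser.can_fetch("Googlebot", "http://www.example.com/private/page")
allowed = parser.can_fetch("Googlebot", "http://www.example.com/public/page")
```

Testing a handful of representative URLs this way catches an overly broad Disallow rule before it ever reaches the live site.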

Bing Webmaster Tools

Microsoft also offers a Webmaster Tools product with some great features. With Bing Webmaster Tools (http://www.bing.com/webmaster) you can get a rich set of data on many of the same types of things as Google Webmaster Tools offers, but this time the feedback comes from the Bing crawler. The viewpoint of a second search engine is a tremendous additional value-add. Figure 11.12 shows the opening screen for the product.

Already you can see some of the great things that are available. At the top right you see the number of pages Bing has indexed. You can also see some of the most important pages, their page scores, and the most recent crawl date. Figure 11.13 shows the “crawl issues” report.

Figure 11.12. Bing Webmaster Tools opening screen


Figure 11.13. Bing Webmaster Tools “crawl issues” report


You can pick from several options:

  • The 404 report lists pages that Bing tried to access, but for which your web server returned a 404 status code.

  • You can get a list of URLs blocked by robots.txt.

  • You can find out where you have long dynamic URLs that may be troublesome to the Bing crawler.

  • You can get a report of the pages on your site that appear to be infected with malware. Hopefully, you never need this, but if you do, it could be a lifesaver.

  • You can get a review of all the URLs on your site that are of an unsupported content type.
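The "long dynamic URL" check in particular is easy to approximate yourself before the crawler complains. The thresholds below are illustrative guesses, not Bing's published limits:

```python
from urllib.parse import urlsplit, parse_qs

def is_long_dynamic(url, max_params=3, max_length=100):
    """Flag URLs that are very long or carry many query parameters.
    Thresholds are illustrative, not Bing's documented limits."""
    query = urlsplit(url).query
    return len(url) > max_length or len(parse_qs(query)) > max_params
```

Run this over your URL list (from a sitemap or server logs) to find candidates worth rewriting into shorter, more static-looking URLs.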

Bing Webmaster Tools also provides you with a look at the set of inbound links that Bing is aware of for your site, as shown in Figure 11.14.

Figure 11.14. Bing Webmaster Tools “inbound links” report


The backlink tool is limited in that you can download only 1,000 results, but it does provide a filtering capability so that you can dig into your backlinks in more detail, as shown in Figure 11.15.

One unique feature is the ability to see which of your pages perform the best for a given search query within Bing, as shown in Figure 11.16.

Figure 11.15. Bing Webmaster Tools inbound link filtering


Figure 11.16. Bing Webmaster Tools page performance by keyword


This type of data can be effective in helping you understand your best optimization strategy if you are trying to win on a given term. In Figure 11.16, Netconcepts probably should not focus on the “SEO: Metrics That Matter” page to try to rank for the query SEO, because so many other pages on the site rank more highly for that term.

However, there is an additional opportunity here. Since all of the listed pages are relevant to the search query, it might be interesting to review which pages in this list link back to the top page in the list, and with what anchor text. This data can help you with your internal linking strategies.
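As a simple sketch of the analysis above: given the per-query performance data for one term (the paths and positions here are invented), the page that already ranks best is the natural candidate to concentrate internal links on.

```python
# Hypothetical (path, SERP position) pairs for one query; lower is better.
pages = [
    ("/seo-metrics-that-matter", 42),
    ("/seo-guide", 3),
    ("/blog/seo-tips", 11),
]

# The best-ranking page is the one to point internal links at.
best_page, best_position = min(pages, key=lambda p: p[1])
```

The remaining pages in the list are the ones to audit for links (and anchor text) pointing back to that top page.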

There is much more here than we have been able to show in just a few pages. Bing Webmaster Tools provides you with the unique opportunity to see an insider’s view of your website: to see it the way Bing sees it.

Yahoo! Site Explorer, Yahoo! Search Engine Link Commands

Yahoo! does not offer a Webmaster Tools product, but it does offer Yahoo! Site Explorer, which is a very valuable tool indeed.

Within Yahoo! there are two ways to access link data: via Site Explorer, which provides a unique interface for browsing link data; and via the normal web search engine, which lets you apply many more advanced parameters and modifiers to your link-based queries.

Yahoo! Site Explorer

Yahoo! Site Explorer offers only a few basic features, including the ability to see:

  • A list of links that point to a given URL

  • A list of links that point to a given subdomain (e.g., Southernfood.about.com, Seomoz.org, or Reddit.com)

  • A list of links that point to a given root domain (e.g., *.about.com, *.seomoz.org, or *.reddit.com, including all their respective subdomains)

  • A modifier that removes links coming from internal pages, either on the subdomain or the root domain, or from sister sites

However, the biggest feature is the ability to download up to 1,000 links into a spreadsheet file. In particular, you can download such a list for any domain at all—it does not have to be a site for which you have control, and in fact, it can be your competitor’s website.
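Once you have the downloaded spreadsheet, a few lines of code can summarize it, for example by counting links per linking domain. The column name and sample rows below are invented stand-ins for a real export.

```python
import csv
import io
from collections import Counter
from urllib.parse import urlsplit

# Stand-in for a downloaded backlink export (up to 1,000 rows);
# the "url" column name and sample rows are hypothetical.
sample = "url\nhttp://a.example/post\nhttp://b.example/page\nhttp://a.example/other\n"

# Count backlinks per linking host to see which domains link most often.
counts = Counter(
    urlsplit(row["url"]).netloc for row in csv.DictReader(io.StringIO(sample))
)
```

Because the export works for any domain, the same summary applied to a competitor's file gives a quick profile of who links to them.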

Using Site Explorer is simple. Just enter a given URL and go (or type a standard link or link domain search query into Yahoo! Search and you’ll be redirected). You’ll be given options to modify the domain parameters and exclusions in the results. Figure 11.17, “Yahoo! Site Explorer report sample” shows a sample report from Yahoo! Site Explorer.

Figure 11.17. Yahoo! Site Explorer report sample


Yahoo! Site Explorer’s biggest weaknesses stem from the lack of crucial data pieces, including:

Ordering

The links are given in “no particular order,” according to Yahoo!’s public representatives, though SEO practitioners generally believe they tend to show more important links before less important links (but certainly not in order from most to least valuable/popular/important/etc.).

NoFollow included

In a tragic move, NoFollowed links are included in the list with followed links, and no differentiation exists between the two, forcing SEO practitioners to do their own research page by page to determine which links the engines might actually be counting.

Target URL

Unless you choose to link to only a given URL, you don’t get to see which page on a domain/subdomain a particular link points to.

Anchor text

No anchor text is provided to show the linking term/phrase/alt attribute that pointed to the page.

Importance metrics

No indication of how valuable/important a particular link or domain might be is provided. Obviously, Google’s PageRank would be a strange one to show here, but Yahoo! used to have its own link-graph-based value, Webrank, which it quickly removed (likely only from public view) in 2004.

Despite these weaknesses, Yahoo! Site Explorer is at least valuable for browsing through a site’s links and getting rough information on the types of sites and pages pointing to the URL/domain. If you want more, you can always click through to each individual link for some of this additional data (though this is a time-consuming process).

Yahoo! Search

Despite the fact that Site Explorer is meant to be the flagship link-searching product, Yahoo!’s normal query system actually provides far greater functionality. This includes the ability to get:

  • A list of links to individual pages or entire domains (link:http://www.yourdomain.tld or linkdomain:yourdomain.tld)

  • Lists that exclude pages from certain domains or top-level domains (-site:domainx.com or -site:.co.uk)

  • Lists that include/exclude pages with certain keywords (-keyword, -intitle:keyword, -intext:keyword, keyword, intitle:keyword, or intext:keyword)

  • Lists that include/exclude pages/domains with certain attributes (-inurl:keyword or inurl:keyword)

  • Lists refined to certain domain extensions (site:com.tr)

  • Lists refined by geographic region (region:europe)

  • Lists of pages that contain links to other pages or domains (linkdomain:domainx.com, linkdomain:domainy.com)

Using Yahoo! Search to explore links requires modifying the standard link queries. If you simply use a query structure such as linkdomain:nbcolympics.com (to see links to a domain) or link:http://www.nbcolympics.com (to see links to an individual page), you’ll be redirected to Site Explorer. You need to combine these queries with additional parameters to get data from Yahoo! Search, as shown in Figure 11.18.

Figure 11.18. Yahoo! Search Linkdomain: command


The weaknesses here are the same as those for Site Explorer: no link targets, no anchor text, and no importance metrics. The only real advantage over Site Explorer is the ability to refine the query.
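Because these refinements are just operators appended to the query string, they are easy to compose programmatically. A small sketch using the operators described above (the helper name is ours, not Yahoo!'s):

```python
def link_query(domain, exclude_sites=(), modifiers=()):
    """Build a Yahoo! advanced link query from its parts: the base
    linkdomain: operator, -site: exclusions, and extra modifiers."""
    parts = [f"linkdomain:{domain}"]
    parts += [f"-site:{site}" for site in exclude_sites]
    parts += list(modifiers)
    return " ".join(parts)
```

For example, `link_query("nbcolympics.com", exclude_sites=("nbcolympics.com",))` produces a query for external links only, which you can paste straight into Yahoo! Search.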

Here are some ways to use Yahoo!’s link data:

  • Determine the approximate number and strength of a site’s links (to help uncover the role that link strength/weakness is playing in rankings and search traffic).

  • Track relative link growth/shrinkage over time.

  • Uncover links that could be causing potential problems (paid links, spammed links, low-quality links, etc.) or drops in rankings (not necessarily because they made a site fall just by existing, but because their value might be removed).

  • Find links that point to pages that return a 404 error.

  • Determine the relative link popularity of a given page on a site compared to others.

  • Investigate the links to a domain or page from which you’re considering getting a link.

  • Find links to competitors that outrank a client for potential acquisition.

  • Find links that point with less-than-ideal anchor text to which you can request modifications.

  • Research a site’s internal link structure to find problems or opportunities.

  • Check links between link partners or multiple sites owned by an entity.
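Tracking relative link growth over time, for example, only requires saving the reported link count from each check and differencing consecutive snapshots (the figures below are invented):

```python
# Monthly link counts captured from the tool; the numbers are made up.
snapshots = {"2009-08": 1200, "2009-09": 1260, "2009-10": 1190}

months = sorted(snapshots)
# Month-over-month change in link count; a negative value means shrinkage,
# which may be worth investigating alongside any ranking drops.
growth = {m2: snapshots[m2] - snapshots[m1] for m1, m2 in zip(months, months[1:])}
```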

The Art of SEO

Learn more about this topic from The Art of SEO.

Four acknowledged experts in search engine optimization share guidelines and innovative techniques that will help you plan and execute a comprehensive SEO strategy. This second edition brings you up to date on recent changes in search engine behavior—such as new ranking methods involving user engagement and social media—with an array of effective tactics, from basic to advanced.


