Google Webmaster Tools: 5 Reasons to Start Using It Today

Excerpts from my latest article at Practical eCommerce, Top 5 Reasons to Use Google Webmaster Tools.

There’s a common misconception that registering with Google Webmaster Tools somehow gives Google more access to information about a site. In fact, just the opposite is true: Google Webmaster Tools gives site owners access to data available from no other source, data every ecommerce site needs to manage its organic search channel.

  1. Organic Impressions: A critical metric for paid search campaigns, impressions have been painfully absent from organic search programs. Google, however, gives search engine optimization professionals a glimpse into impressions in the “Search Queries” report under “Your Site on the Web.”
  2. Real Ranking Data: In the very same Search Queries reports for top queries and top pages, Google also provides average ranking data. This is a fantastic metric to have, with so many ranking tools falling prey to personalization biases.
  3. Complete Backlink Reporting: Reliable backlink data is becoming increasingly hard to find. With Yahoo! Site Explorer’s sad demise and Bing’s refusal to honor the link: query, Google has become the only major U.S. search engine to give any information on backlinks.
  4. +1 Metrics: Google offers a trio of Google +1 reports, including “Search Impact,” “Activity” and “Audience.” The search impact report details how many impressions were annotated with a friend’s +1, and how many clicks those +1 annotated search results drove.
  5. New Crawl Error Reports: Just this week Google expanded its crawl error reporting in an attempt to make it more actionable. This report has always been somewhat problematic because it lists errors that sometimes have a very valid reason for existing.

Read the article in full at Practical eCommerce, with more detailed descriptions of the features and illustrations »

BONUS 6TH REASON: Robots.txt Testing: Test changes to your robots.txt files before posting them live and discovering you disallowed your key product lines. You can find this indispensable tool under “Site Configuration” and then “Crawler Access.” Never update a robots.txt file without testing it with this tool first.
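You can sanity-check a robots.txt draft locally too. Here’s a minimal sketch using Python’s standard-library `urllib.robotparser`; the robots.txt content and URLs are hypothetical, chosen to show the kind of overly broad Disallow rule the Crawler Access tool catches:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt draft with an overly broad Disallow rule.
draft = """\
User-agent: *
Disallow: /checkout/
Disallow: /products
"""

parser = RobotFileParser()
parser.parse(draft.splitlines())

# "Disallow: /products" is a prefix match, so it blocks every URL
# under /products -- including key product pages.
print(parser.can_fetch("*", "/products/blue-widgets"))  # False
print(parser.can_fetch("*", "/about"))                  # True
```

Note that the missing trailing slash on `/products` is what turns a plausible-looking rule into a storewide block, which is exactly why testing before posting live matters.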


Originally posted on Web PieRat.

Making Sense of SEO Keyword Research with Mapping

Excerpts from my latest article at Practical eCommerce; read it in full here.


Keyword research is essential to search engine optimization. It’s the window into the words that real searchers use to find products like the ones you sell. But at the end of the keyword research process — detailed in “Part 1: Keyword Research” — search marketers can be overwhelmed by the vast amount of data staring at them from their Excel spreadsheets. Keyword categorizing and mapping help move the optimization process from the research phase to the actual optimization phase.

Categorizing Keywords

During the keyword research process, patterns start to appear. Different types of keywords emerge that can be logically grouped into categories reflecting the site’s business goals and core product offerings. For example, if my site sells subscriptions to online games for kids, my keyword research could yield 12,000 phrases or more based on the research conducted in Google’s free Keyword Tool. But because each keyword needs to be related to my core product offering, I can start to categorize them and delete the ones that aren’t directly relevant.

Let’s say that my site sells games. But it doesn’t sell just any games; it sells online games for kids. Those are the three vital components for choosing keywords that are specifically targeted to my product offering: “types of games,” “online vs. offline,” and synonyms for the word “kids,” as listed in the spreadsheet, below.
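The categorizing step above can be sketched programmatically. This is a minimal illustration, assuming hypothetical category terms for the three components; in practice the category lists would come from the patterns you spot in your own research spreadsheet:

```python
# Hypothetical category patterns for the three components above.
categories = {
    "game type": ["puzzle", "arcade", "adventure"],
    "online/offline": ["online", "browser", "download"],
    "audience": ["kids", "children", "child"],
}

def categorize(keyword):
    """Return the categories whose terms appear in the keyword."""
    return [cat for cat, terms in categories.items()
            if any(term in keyword for term in terms)]

research = ["online puzzle games for kids",
            "free arcade games",
            "lawn games for adults"]

for kw in research:
    matched = categorize(kw)
    # Keywords that match no category are candidates for deletion.
    print(kw, "->", matched or "not relevant")
```

A keyword like “lawn games for adults” matches none of the categories, flagging it as one of the phrases to delete before the optimization phase.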

See the diagrams and read more on how to categorize & map keywords »



8 SEO Reasons to Crawl Your Sites

My latest article at Ecommerce Developer; read it in full here.


The first thing I do when working with a new site is set my favorite crawler on it. This gets me acquainted with all the URLs, site sections, interlinkings, forgotten pockets, scars and warts. A good crawler offers a wealth of data useful not just to search engine optimization, but also to site maintenance in general.

Luckily, some great crawlers are free. You’ll find pages of options just by Googling “web crawler” or a similar term. Xenu Link Sleuth is my favorite for the price — it’s free — and for the broad assortment of data collected on every URL it crawls. GSite Crawler is another good, free alternative. It’s focused mainly on creating XML sitemaps and feeds, but it’s good for other uses as well.
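To show the kind of data a crawl collects, here’s a toy breadth-first crawler. This is only a sketch: the “site” is a hypothetical in-memory map of URLs to outbound links, where a real crawler like Xenu would fetch each URL over HTTP. The report it builds (URL, status, outlinks) is the raw material for finding broken links and forgotten pockets:

```python
from collections import deque

# Hypothetical in-memory "site": URL -> list of outbound links.
# A real crawler would fetch each URL over HTTP instead.
site = {
    "/": ["/products", "/about"],
    "/products": ["/products/widgets", "/old-page"],
    "/products/widgets": ["/"],
    "/about": [],
}

def crawl(start):
    """Breadth-first crawl recording each URL, its status, and its outlinks."""
    seen, report = set(), []
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        links = site.get(url)
        status = 200 if links is not None else 404  # missing page = broken link
        report.append((url, status, links or []))
        for link in links or []:
            queue.append(link)
    return report

for url, status, links in crawl("/"):
    print(status, url, "->", len(links), "outlinks")
```

Running it surfaces `/old-page` as a 404, the same class of finding a full crawler hands you across thousands of URLs.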

Read more »



Excel Plug-ins Cut My Mass Keyword Research Time by 75%

Richard Baxter’s Google Adwords API Extension for Excel plug-in has completely turned my keyword research world upside down, requiring a new set of tools to cope with massive data overload. I’m not complaining; it’s an awesome plug-in and free to boot. But new tools require changes in the processes and other tools used to wrangle the data into a usable form. With a combination of Baxter’s AdWords API extension, the DigDB plug-in for Excel, and good old-fashioned waiting for processing to complete in Excel, I’ve cut my high-volume keyword research time by 75%.

By high-volume keyword research I mean identifying related keywords for thousands of known phrases, which, while fantastically tedious, is also quite useful for finding new keyword markets to optimize for. For example, within five minutes of reading Baxter’s post on SEOgadget and downloading the plug-in, I was happily fetching keyword ideas for 46,000 phrases. Why 46,000? Well, I had a list of 200 cities, 150 topics and 4 different phrasings for each combination of city and topic. Adds up quickly, doesn’t it? Thank god for MergeWords. But back to the AdWords API plug-in.
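The MergeWords-style expansion above is a straight Cartesian product, easy to sketch with Python’s `itertools.product`. The cities, topics, and phrasing templates here are hypothetical stand-ins for the article’s much longer lists:

```python
from itertools import product

# Hypothetical short lists; the real research used 200 cities,
# 150 topics, and 4 phrasings per city/topic combination.
cities = ["chicago", "denver"]
topics = ["plumbers", "electricians"]
templates = ["{topic} in {city}", "best {city} {topic}"]

# Every city x topic x phrasing combination becomes one seed phrase.
phrases = [t.format(city=c, topic=p)
           for c, p, t in product(cities, topics, templates)]

print(len(phrases))   # 2 * 2 * 2 = 8
print(phrases[0])     # "plumbers in chicago"
```

Scale the lists up and the phrase count multiplies just as fast as the article describes, which is why the follow-on processing needs its own tooling.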

Continue reading “Excel Plug-ins Cut My Mass Keyword Research Time by 75%”