Comment Spam Outing on TechCrunch

Personal Note: You may know that I am the SEO Manager for Groupon, a competitor/co-opetitor with some of the sites mentioned here. I found this example reading group buying industry news, not in an effort to spread stories about other sites in Groupon’s space. I would have felt compelled to post this perfect example of good and bad commenting even if it had been a story completely unrelated to my employer’s industry, simply because it is related to the SEO industry.

Comment spam is one of the lowest forms of “link building.” Yet people continue to do it because it’s easy, because they think it works and because they don’t understand that many comment sections aren’t crawlable even if the links are followed. But sometimes comment spammers get outed, like in the comments on this TechCrunch post about the partnership between BuyWithMe and SCVNGR. I grabbed a couple of screenshots to illustrate this example in case the thread is deleted.

What’s wrong with these comments? If they are from real people with real opinions, nothing. But the same users allegedly posted these comments on multiple TechCrunch posts, leaving links to UrbanSpoils on each. Other readers recognized the tactic and called him/them out on it. Personally, I can’t find any duplicate comments from these users, but I didn’t look very long either. The point remains: comment spam at your peril. Is the inserted link valuable enough to risk the scorn of the other readers and commenters? My answer is no. And in this case at least, certainly not. Here’s why:

  • Disqus.com comments are fed into posts using JavaScript. The comment spam isn’t even crawlable in this instance (see cache); a quick way to verify this yourself is sketched after this list. So if the goal was to seed links into posts to improve link popularity, #fail.
  • If the intent was to gain click-through traffic from other readers, it’s possible that some clicks were achieved before the outing. Afterwards, I would doubt that many clicked, at least not with the intent to transact. But only the log files know for sure. Still, I call this a #fail.
  • And lastly, any positive brand recognition that the comment mentions would have generated has very likely been more than negated by the tongue-lashing from other commenters. So #fail.
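
For the curious, here’s a rough way to check that first point yourself. This is a minimal sketch, assuming Python with the requests library installed; the URL and the comment snippet are placeholders, not real data from the TechCrunch thread. It fetches the raw HTML a crawler would see, without executing JavaScript, and looks for the comment text.

```python
# Minimal sketch: is the comment text present in the raw HTML a crawler fetches,
# or is it only injected later by JavaScript (e.g. a Disqus embed)?
# The URL and snippet below are hypothetical placeholders.
import requests

POST_URL = "https://example.com/some-blog-post/"   # hypothetical post URL
COMMENT_SNIPPET = "UrbanSpoils"                     # text you expect to see in a comment

html = requests.get(POST_URL, timeout=10).text      # raw HTML, no JavaScript executed

if COMMENT_SNIPPET in html:
    print("Snippet found in the raw HTML -- the comment is likely crawlable.")
else:
    print("Snippet not in the raw HTML -- comments are probably loaded via JavaScript.")
```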

Not all commenting is spam, of course. When you legitimately have something of value to add to a conversation and when you disclose your identity if you have a self-interest, then comment away. On the very same post, the co-founder of SayLocal comments, disclosing his identity and self-interest. You’ll note that no one flames his comment. It may be just as self-interested, but he has given us the ability to judge his comment honestly.



Originally posted on Web PieRat.

95% Drop in Search Counts on Google AdWords Keyword Tool


Google AdWords quietly updated its Keyword Tool on Sept. 7, resulting in a dizzying drop of up to 95% in search counts for some phrases. For those who rely heavily on the keyword tool to predict potential SEO performance increases or to identify phrases to optimize for, this update will come as a heavy blow. Details from Google are sketchy, and the SEO community has been silent on the issue except for Dave Naylor and a few folks in the forums. So what happened?

“If you use both the previous and updated versions of the Keyword Tool to search for keywords, you may notice differences between the tools for statistics on Global Monthly Searches and Local Monthly Searches. This is because the previous version of the Keyword Tool provides search statistics based on Google.com search traffic and traffic from search partners, while the updated version of the Keyword Tool provides search statistics based on Google.com traffic only.  We’ve updated these statistics based on user feedback, and hope you find them helpful for keyword selection.” >> from AdWordsPro on the AdWords Help Forum

The Search Network: Your ads may appear alongside or above search results, as part of a results page as a user navigates through a site’s directory, or on other relevant search pages. Our global Search Network includes Google Maps, Google Product Search and Google Groups along with entities such as Virgin Media and Amazon.com. >> from AdWords Help

OK, so previously the data was based on an aggregation of Google.com searches and traffic from search partners like Amazon and Virgin Media. But since Sept. 7, the data is based solely on Google.com searches. Am I the only one who would have found it incredibly helpful to know before now that a data source as large as Amazon’s was skewing the data in the Google keyword tool?

Yes, we all knew/theorized/suspected that the AdWords keyword tool data was skewed. And given the source (Google) and the purpose of the tool (get advertisers to pay for ads on keywords), we took it with a grain of salt. But seriously, 95% inflation is a very, very big grain of salt.

If you haven’t already, consider this a wake-up call to diversify your SEO data sources and to understand that most public SEO data is relative rather than exact. First, diversification. Try using more than one data source for keyword data. There are a lot of tools out there, but these are the old stand-bys: WordTracker, Keyword Discovery, SEO Book’s tool. Keep in mind the source of each tool’s data: each relies either on the engines’ own APIs or on toolbar and ISP data, which renders each pretty much as reliable as the Google keyword tool anyway.
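
If it helps, here is a minimal sketch of what “more than one data source” can look like in practice. It assumes you have exported keyword counts from each tool into simple CSV files with keyword and count columns; the file names and column headers are placeholders, since every tool exports in its own format.

```python
# Minimal sketch: line up search-count estimates for the same phrases across
# several keyword tools so you can compare relative (not absolute) demand.
# File names and column headers are assumptions -- adjust to each tool's export.
import csv

SOURCES = {
    "adwords": "adwords_export.csv",            # hypothetical export files
    "wordtracker": "wordtracker_export.csv",
    "keyword_discovery": "kd_export.csv",
}

def load_counts(path):
    """Read a keyword,count CSV into a dict of {phrase: count}."""
    with open(path, newline="") as f:
        return {row["keyword"].lower(): int(row["count"]) for row in csv.DictReader(f)}

data = {name: load_counts(path) for name, path in SOURCES.items()}

# Print side-by-side counts for phrases present in every source.
phrases = set.intersection(*(set(counts) for counts in data.values()))
for phrase in sorted(phrases):
    row = ", ".join(f"{name}: {counts[phrase]:,}" for name, counts in data.items())
    print(f"{phrase} -> {row}")
```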

Which brings us to relativity. Get used to the reality that you will never know the exact number of searches that occur for any given keyword or phrase. Public keyword data and search frequency data are imprecise. The only data you can be certain of in SEO is the data that comes from your own log files on your own site. All other data is approximate. If you base projections on approximations, be certain that you take into account the squidginess of the data sources.

In this update, the biggest differences between the pre- and post-update numbers appear to be on the larger, more competitive head terms, both in this sample and in the analysis I’ve been doing on my own keyword sets. Here’s an example. Let’s pretend we’re researching keywords for a site that offers hotel reservations. I might target a sampling of keywords like these.

Google AdWords Keyword Tool update examples. Data source: Google AdWords Keyword Tool, Exact Match, US English.

Had I taken the Google AdWords Keyword Tool at face value, I would have expected a US English keyword market of 450,000 monthly searches for the phrase [san francisco hotels]. I might project winning 10% of those searches and figure a conversion rate of 5% based on my analytics. Each conversion might bring me, say, $10, so I might project a value of ((450,000 × 10%) × 5%) × $10 = $22,500 in a month.

Unfortunately, the updated number based on true Google.com searches, according to the tool in September, is 95% lower at 33,100. So now we’re looking at ((33,100 × 10%) × 5%) × $10 = $1,655 in a month. Not bad for a single keyword in a single month, but much lower than the previous projection, and it comes with hefty competition.
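
To make the projection math explicit, here is the same back-of-the-envelope calculation as a tiny function. The 10% capture rate, 5% conversion rate and $10 per order are the illustrative assumptions used above, not benchmarks.

```python
# Back-of-the-envelope monthly revenue projection from an estimated search count.
# Capture rate, conversion rate and revenue per order are illustrative assumptions.
def monthly_value(searches, capture_rate=0.10, conversion_rate=0.05, revenue_per_order=10.0):
    """Project monthly revenue from an estimated monthly search count."""
    return searches * capture_rate * conversion_rate * revenue_per_order

print(monthly_value(450_000))  # old tool estimate    -> 22500.0
print(monthly_value(33_100))   # updated estimate     -> 1655.0
```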

For a detailed description on safer ways to project, see SEOmoz’s post: Predicting Yearly Site Traffic.



Originally posted on Web PieRat.

Give Value to Get SEO Value

I expounded on the SEO principle of “Give to Get” in my recent Practical eCommerce article. Sites aspiring to lasting SEO performance have to give value to get value. If the audience doesn’t get value from the content, they will not feel compelled to give value to the site in the form of links, comments, reviews, “likes” and other forms of participation that add uniqueness or popularity to a site.

After completing the article, I started playing devil’s advocate. There are some sites that create little content of their own yet fare quite well in the search results. These exceptions to the “give value to get value” rule are major sites that have the strength of a massive network of popular sites behind them, or sites that mash syndicated content with user generated content.

Shopping comparison sites and reviews sites are excellent examples of both exceptions. Consider BizRate, which is part of the Scripps Network with Shopzilla, HGTV, Food Network, etc. I’m not suggesting that their only strength lies in their network interlinking, but they certainly do use it to their benefit. Each of the other sites links at least to BizRate, and BizRate links only to Scripps. I’d do the same, of course. My only point is that it does give them a leg up and lessen the need for them to innovate as strongly to provide valuable unique content.

User generated content is another way around the content creation conundrum. Reviews sites are prime examples of this phenomenon. Epinions doesn’t write the product descriptions for the product reviews on its site; instead, it provides a platform to mash syndicated product content with user generated reviews. Of course, as part of the Shopping.com network, Epinions also benefits from the aforementioned network interlinking value.



Originally posted on Web PieRat.