Google AdWords quietly updated its Keyword Tool on Sept. 7, resulting in a dizzying drop of up to 95% in search counts for some phrases. For those who rely heavily on the keyword tool to predict potential SEO performance increases or to identify phrases to optimize for, this update will come as a heavy blow. Details from Google are sketchy, and the SEO community has been largely silent on the issue, except for Dave Naylor and a few folks in the forums. So what happened?
“If you use both the previous and updated versions of the Keyword Tool to search for keywords, you may notice differences between the tools for statistics on Global Monthly Searches and Local Monthly Searches. This is because the previous version of the Keyword Tool provides search statistics based on Google.com search traffic and traffic from search partners, while the updated version of the Keyword Tool provides search statistics based on Google.com traffic only. We’ve updated these statistics based on user feedback, and hope you find them helpful for keyword selection.” >> from AdWordsPro on the AdWords Help Forum
The Search Network: Your ads may appear alongside or above search results, as part of a results page as a user navigates through a site’s directory, or on other relevant search pages. Our global Search Network includes Google Maps, Google Product Search and Google Groups along with entities such as Virgin Media and Amazon.com. >> from AdWords Help
OK, so previously the data was based on an aggregation of Google.com searches and traffic from search partners like Amazon and Virgin Media. But since Sept. 7, the data is based solely on Google.com searches. Am I the only one who would have found it incredibly helpful to know before now that a data source as large as Amazon's was skewing the numbers in the Google keyword tool?
Yes, we all knew/theorized/suspected that the AdWords keyword tool data was skewed. And given the source (Google) and the purpose of the tool (get advertisers to pay for ads on keywords), we took it with a grain of salt. But seriously, 95% inflation is a very, very big grain of salt.
If you haven't already, take this as a wake-up call to diversify your SEO data sources and to understand that most public SEO data is relative rather than exact. First, diversification: try using more than one data source for keyword data. There are a lot of tools out there, but these are the old standbys: WordTracker, Keyword Discovery, and SEO Book's keyword tool. Keep in mind the source of each tool's data: they use either APIs from the engines themselves or toolbar and ISP data, which makes each pretty much as reliable as the Google keyword tool anyway.
Which brings us to relativity. Get used to the reality that you will never know the exact number of searches that occur for any given keyword or phrase. Public keyword data and search frequency data are imprecise. The only data you can be certain of in SEO is the data that comes from your own log files on your own site. All other data is approximate. If you base projections on approximations, account for the squidginess of the data sources.
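To make that concrete, here is a minimal sketch (in Python) of treating a reported keyword volume as a range rather than an exact figure before projecting from it. The 0.25x-1.0x band is a purely illustrative assumption, not a measured error margin for any particular tool.

```python
# A rough sketch of hedging against imprecise keyword data: treat the
# reported volume as an approximate upper bound and plan against a range.
# The low/high factors below are illustrative assumptions only.

def volume_range(reported_searches, low_factor=0.25, high_factor=1.0):
    """Return (low, high) estimates of real search volume from a reported figure."""
    return int(reported_searches * low_factor), int(reported_searches * high_factor)

low, high = volume_range(450000)
print("Plan against %d to %d monthly searches, not exactly 450,000" % (low, high))
```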
In this update, the biggest differences between the pre- and post-update numbers appear to be on the larger, more competitive head terms, both in this sample and in the analysis I've been doing on my own keyword sets. Here's an example. Let's pretend we're researching keywords for a site that offers hotel reservations. I might target a sampling of keywords like these.
Had I taken the Google AdWords Keyword Tool at face value, I would expect there to be a US English keyword market of 450,000 searches for the phrase [san francisco hotels]. I might project winning 10% of those searches and figure a conversion rate of 5% based on my analytics. Each conversion might bring me, say, $10, so I might project a value of (((450,000 * 10%) * 5%) * $10) = $22,500 in a month.
Unfortunately, the updated number based on true Google.com searches, according to the tool in September, is roughly 93% lower at 33,100. So now we're looking at (((33,100 * 10%) * 5%) * $10) = $1,655 in a month. Not bad for a single keyword in a single month, but far lower than the previous projection, and it comes with hefty competition.
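For illustration, here's the same projection arithmetic as a small Python sketch. The 10% capture rate, 5% conversion rate, and $10 per conversion are the example's assumptions, not measured figures.

```python
# A minimal sketch of the monthly-value projection used in the example above.
# Capture rate, conversion rate, and value per conversion are illustrative
# assumptions carried over from the text.

def project_monthly_value(monthly_searches, capture_rate=0.10,
                          conversion_rate=0.05, value_per_conversion=10.0):
    visits = monthly_searches * capture_rate       # searches we expect to win
    conversions = visits * conversion_rate         # visitors who convert
    return conversions * value_per_conversion      # projected monthly value

print(round(project_monthly_value(450000)))  # 22500 -- pre-update, partner-inflated volume
print(round(project_monthly_value(33100)))   # 1655  -- post-update, Google.com-only volume
```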
For a detailed description of safer ways to project, see SEOmoz's post, Predicting Yearly Site Traffic.
Originally posted on Web PieRat.
I've known this for some time now; it's pretty obvious that Google was trying to get advertisers (especially big brands) salivating over their traffic….
I would say 95% is conservative, but GREAT POST AND GREAT FIND!
Twitter: @angle45media
Great info and insight here, Jill.
I'm always skeptical of the actual numbers coming from that tool; I only rely on it for apples-to-apples comparisons of search phrases.
You're absolutely right about our dependence on the tool and on Google's data. They control basically all the data, and it's up to them. I've had days where they decided to switch something on their end and it literally cut the legs out from under me. The big question, though, is what we can use instead, since, as you point out (and I agree), all the other tools basically rely on the same data. What to do? What to SEO? Ha, ha…
There are many tools available on the web for keyword research, but I have always preferred the Google AdWords tool to analyze my keywords.
I have also had some reservations about this tool, and I have not found it especially effective, but Google is now updating many of its tools, like Google Analytics and Google AdWords, to meet standards of quality.