Will you +1? I will.

 

Google has taken a lot of criticism for failure to launch in the social sphere. Some cite the launch of +1 for websites as the most recent example of missing the boat. But I say it's OK to launch a social signal before a (second) social platform. They serve different purposes.

I agree that Google Wave was awful, at least in terms of its rollout. How can you possibly expect a social platform to succeed when users' friends and coworkers don't receive invitations? Social just isn't social without your network. Not to mention its freakish complexity.

So how, then, can +1 succeed without a social network behind it? In its far more limited scope, just having contacts in Gmail constitutes a network.

As someone who uses Google products regularly, from Gmail and Google Calendar to Google Docs and Android, not to mention AdWords and search, the number of people associated with my Google account is light-years ahead of the number I could find on our invite to Wave. Which was 6. And they weren't the 6 people I needed to collaborate with most. Shrug.

The lower threshold of participation in +1, as well as its simplicity, is already a big step in the right direction. While +1s won't impact rankings (today), the potential for searchers to see their friends' and coworkers' +1s in search results would be a powerful way to encourage click-through to your listing vs. a competitor's.

With a single click you can recommend that raincoat, news article or favorite sci-fi movie to friends, contacts and the rest of the world. The next time your connections search, they could see your +1’s directly in their search results, helping them find your recommendations when they’re most useful.
— Google Blog

Granted, the effectiveness of +1 as a differentiator for your search marketing efforts depends on your audience’s (and their networks’) adoption of Google products. But for the relatively low price of adding a button, it’s worth a try in my books. I’ve got the request in to my dev team already.

Now the problem is juggling all these darned buttons without hopelessly cluttering the design – two for Twitter, two for Facebook, one for email, and now +1. Isn't there a more elegant solution that doesn't just roll them all up into one invisible "add this"? Bueller?



Originally posted on Web PieRat.

Cache in Hand

Google's cache view is a valuable window into how Googlebot "sees" a site. I find myself stalking the cache every couple of days in an effort to untangle architectural challenges to my SEO objectives. When I'm having difficulty helping colleagues understand why content or links are or aren't crawlable, I often take them to the cache view as a quick and easy visual. Once they see what the bots see, from Googlebot itself, the conversation around how to resolve the issue is usually much easier. I wanted to include it in my article on advanced search operators at Practical Ecommerce last week, but I hit the word count cap. So here's the scoop on cache.

A site's architecture and the technology choices its development team makes can make or break the bots' ability to crawl a site. The cache view offers a quick window into the bots'-eye view. For example, most humans surfing with a modern browser that executes JavaScript and accepts cookies will see Dell's homepage like this:

Dell.com homepage screenshot.

As a human, I can use the drop-down menus to navigate to the main areas of the site, quickly consume many of Dell's priority messages from the static feature boxes and the Flash carousel, and browse the basic HTML links toward the bottom of the page. Dell makes its marketing priorities very clear and easy to understand... for humans with modern browsers. But what about the bots? What content can they consume? Let's take a look at the cache view [cache:www.dell.com]:

In the cache view, the page looks remarkably similar. There's a gray header at the top of the page indicating that Google last cached this page on Oct 4, 2010 18:22:07 GMT, one hour and one minute before the time of this article. So any changes that Dell made to the site in the last 61 minutes will not be reflected in this cache view. That's an important point when you're trying to confirm the crawlability of a new architectural change: make sure the change has been cached before you start analyzing the cache view.

The second thing to consider is that the cache view shows a far more human-centric view of the page than I'd expect. That's because the initial cache view still uses your modern browser to execute the JavaScript, CSS and cookies that the cached page calls. To see the bots'-eye view more realistically, we need to disable those by clicking the "Text-only version" link in the upper right corner of the gray box. Now we see:

Now we're seeing the textual version of the site, stripped of its technical finery. The rollover navigation in the header no longer functions. The links to the main categories are still crawlable as plain text links, but without the rollover menus the homepage doesn't pass link popularity down to the subcategory pages. Depending on the marketing value of those pages, the lack of link juice flowing there could be an issue. The next thing we see is that the big, lovely Flash carousel, so front-and-center for human consumption, doesn't exist without JavaScript enabled. Assuming the pages featured in the Flash piece are valuable landing pages, which they likely are to warrant homepage coverage and development time, this again is a missed opportunity to flow link juice to important pages. Both of these issues, the navigation and the Flash carousel, could be coded to degrade gracefully using CSS so that bots get the same crawlable text and links as humans.

To be safe, I double-check any issue I see in the cache view (or any issue I expect to see but don't) by manually disabling JavaScript, CSS and cookies, and by setting my user agent to Googlebot. For more detailed information on the Firefox plugins I use to do this, see Surfing Like a Search Engine Spider on Practical Ecommerce. The cache view is a quick way to decide whether a deeper analysis is required.
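If you'd rather spot-check from a script than from a browser, here's a minimal sketch of the same idea. It assumes Python with the requests and beautifulsoup4 packages installed, and the Dell URL is just the example from above; it fetches the raw HTML with a Googlebot user agent, no cookies and no JavaScript execution, and lists the plain links a crawler could actually follow.

```python
# Rough approximation of a bots'-eye view: fetch raw HTML with a Googlebot
# user agent (no JavaScript execution, no cookies) and list the plain
# <a href> links a crawler could follow.
import requests
from bs4 import BeautifulSoup

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def crawlable_links(url):
    response = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA})
    soup = BeautifulSoup(response.text, "html.parser")
    return [a["href"] for a in soup.find_all("a", href=True)]

# Example: compare this list against the links you see in your browser.
for link in crawlable_links("http://www.dell.com/"):
    print(link)
```

This won't match Google's cache exactly, since the real crawler is far more sophisticated, but if a link only appears when JavaScript runs in your browser, it won't show up here either.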

Note: The cache: operator only works on Google, but Yahoo Site Explorer offers a cache link on each page in its report as well. Bing does not support the cache: operator.



Originally posted on Web PieRat.

95% Drop in Search Counts on Google AdWords Keyword Tool

 

Google AdWords quietly updated its Keyword Tool on Sept. 7, resulting in a dizzying drop of up to 95% in search counts for some phrases. For those who rely heavily on the keyword tool to predict potential SEO performance increases or to identify phrases to optimize for, this update will come as a heavy blow. Details from Google are sketchy, and the SEO community has been silent on the issue except for Dave Naylor and a few folks in the forums. So what happened?

“If you use both the previous and updated versions of the Keyword Tool to search for keywords, you may notice differences between the tools for statistics on Global Monthly Searches and Local Monthly Searches. This is because the previous version of the Keyword Tool provides search statistics based on Google.com search traffic and traffic from search partners, while the updated version of the Keyword Tool provides search statistics based on Google.com traffic only.  We’ve updated these statistics based on user feedback, and hope you find them helpful for keyword selection.” >> from AdWordsPro on the AdWords Help Forum

The Search Network: Your ads may appear alongside or above search results, as part of a results page as a user navigates through a site’s directory, or on other relevant search pages. Our global Search Network includes Google Maps, Google Product Search and Google Groups along with entities such as Virgin Media and Amazon.com. >> from AdWords Help

OK, so previously the data was based on an aggregation of Google.com searches and traffic from search partners like Amazon and Virgin Media. But since Sept. 7, the data is based solely on Google.com searches. Am I the only one who would have found it incredibly helpful to know before now that data sources as large as Amazon's were skewing the numbers in the Google keyword tool?

Yes, we all knew/theorized/suspected that the AdWords keyword tool data was skewed. And given the source (Google) and the purpose of the tool (get advertisers to pay for ads on keywords) we took it with a grain of salt. But seriously, 95% inflation is a very, very big grain of salt.

If you haven't already, take this as a wake-up call to diversify your SEO data sources and to understand that most public SEO data is relative rather than exact. First, diversification. Try using more than one data source for keyword data. There are a lot of tools out there, but these are the old stand-bys: WordTracker, Keyword Discovery, SEO Book's tool. Keep in mind the source of each tool's data: they either use the APIs from the engines themselves or toolbar and ISP data, which renders each pretty much as reliable as the Google keyword tool anyway.

Which brings us to relativity. Get used to the reality that you will never know the exact number of searches that occur for any given keyword or phrase. Public keyword data and search frequency data are imprecise. The only data you can be certain of in SEO is the data that comes from your own log files on your own site. All other data is approximate. If you base projections on approximations, be certain that you take into account the squidginess of the data sources.

In this update, the biggest differences between pre and post appear to be on the larger, more competitive head terms, both in this sample and in the analysis I've been doing on my own keyword sets. Here's an example. Let's pretend we're researching keywords for a site that offers hotel reservations. I might target a sampling of keywords like these.

Google AdWords Keyword Tool update examples. Data source: Google AdWords Keyword Tool, Exact Match, US English.

Had I taken the Google AdWords Keyword Tool at face value, I would expect a US English keyword market of 450,000 searches for the phrase [san francisco hotels]. I might project winning 10% of those searches and figure a conversion rate of 5% based on my analytics. Each conversion might bring me, say, $10, so I might project a value of (450,000 * 10%) * 5% * $10 = $22,500 in a month.

Unfortunately, the updated number based on true Google.com searches, according to the tool in September, is roughly 93% lower at 33,100. So now we're looking at (33,100 * 10%) * 5% * $10 = $1,655 in a month. Not bad for a single keyword in a single month, but far lower than the previous projection, and it comes with hefty competition.
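Here's a quick sketch of that back-of-the-envelope math in Python. The capture rate, conversion rate and per-conversion value are just the illustrative assumptions used above, not real figures.

```python
def monthly_value(searches, capture_rate=0.10, conversion_rate=0.05, value_per_conversion=10.0):
    """Back-of-the-envelope monthly revenue projection from a monthly search count."""
    return searches * capture_rate * conversion_rate * value_per_conversion

# [san francisco hotels], exact match, per the Keyword Tool
print(monthly_value(450000))  # old, partner-inflated count -> 22500.0
print(monthly_value(33100))   # updated, Google.com-only count -> 1655.0
```

The point isn't the precision of either number; it's how wildly the projection swings when the underlying search count changes.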

For a detailed description on safer ways to project, see SEOmoz’s post: Predicting Yearly Site Traffic.



Originally posted on Web PieRat.

Will Diverting Transactional Intent Preempt Search?

“The battle for search is over for now — Google won — but the battle for the underlying revenue is just heating up,” according to Alex Rampell, the CEO of TrialPay. I hadn't thought of it this way before, but his thought-provoking article at TechCrunch, “Preempting Search,” got me thinking.

Rampell’s premise is that:

For Google’s enemies, the best way of hurting the search goliath is not to build a better search engine, but rather to give people a reason to stop searching for a wide class of goods and services by preempting search on Google. Given Google’s dependence on harvesting “transactional intent” for its revenue, the key is to move transaction initiation off of Google.

I wonder, though. People are so lazy. Is it really possible that they will circumvent searching in favor of a multitude of transactional and vertical niche sites when Google already gives them easy access to those niches in a single place they trust? With Google's reputation for quality search results, whether one agrees with it or not, it's so much easier for the masses to just search and trust Google to show them where to go. Perhaps each individual has a soft spot for a certain transactional site or vertical (I admit I'm an Amazon & Expedia devotee), but I can't believe that most searchers would take the time to develop a portfolio of trusted individual sites to meet their individual needs.

Still, if each person chooses just one transactional or vertical site over Googling, that could wear away Google's hold on that revenue, ceding it to a thousand niche players. Interesting to ponder. Thoughts?



Originally posted on Web PieRat.

Site Speed’s Role in Google Rankings

Reproduced from my April 19 article on Practical Ecommerce:

In its continuing quest to provide searchers with the best possible search experience, Google announced last week that site speed is now a signal in its search ranking algorithms. Along with the hundreds of other signals, like link popularity and keyword relevance, Google is now factoring site speed (i.e., how quickly a website responds to web requests) into its rankings on search results pages.

Why Does Google Care About Speed?

Searchers presumably associate the quality of the page they land on with Google's brand. If a page that Google ranks isn't topically relevant, the searcher's Google brand experience is negative. Google is taking that a step further with this algorithm update, implying that a site that is slow to respond or load will also result in a negative search experience that reflects poorly on Google.

It is true that bounce rates are higher on slow sites, indicating that searchers find the experience less acceptable. And Google is thought to factor bounce rates (quick returns from a search landing page back to Google) into their algorithms. Site speed, then, would just be an extension of that logic.

What About the Ugly Factor?

As a search user, I’m all for anything that encourages increased site speed. But, as a marketer and SEO professional, this feels like a slippery slope. Ugly sites also experience higher bounce rates – perhaps there should be an “ugly” factor in the algorithm. Obviously, I jest. “Ugly” is impossible to define empirically and quantify outside of the bounce rate data that “ugly” would theoretically increase.

Regardless of how marketers feel about the site speed addition to the algorithm, it's live now. Google has introduced the site speed factor to only approximately 1 percent of search queries, so the likelihood that any one site will notice a dramatic shift today is extremely small.

Google rolls these changes out slowly to avoid the sudden, massive shifts in rankings that we used to see several years ago. While this is a welcome change, it does make it harder to pinpoint the source of the issue, if there is one. As a result, a site with major site speed issues may have trouble discerning that speed actually is the issue. Impacted sites will more likely notice a gradual decline in rankings and organic search-referred traffic and sales as the site speed factor gradually impacts higher percentages of search queries.

Test Your Site Speed

Google Webmaster Tools recommends an assortment of tools to test site speed to determine if there is an issue, including Google’s Page Speed, Yahoo! YSlow, and WebPageTest. If site speed is indeed an issue according to these tools, a host of business or technical challenges could be contributing. Ecommerce sites tend to include a multitude of analytics and usability feature scripts that can contribute to slow response times, as well as hosting configurations that limit bandwidth available for a site.
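If you want a rough, scriptable baseline before reaching for those tools, here's a minimal sketch. The URL is a placeholder, and it measures only server response and HTML download time, not rendering or the page's scripts and images, which the tools above account for.

```python
import time
import urllib.request

def average_response_time(url, runs=3):
    """Average server response + HTML download time, in seconds."""
    timings = []
    for _ in range(runs):
        start = time.time()
        with urllib.request.urlopen(url) as response:
            response.read()  # pull down the full HTML body
        timings.append(time.time() - start)
    return sum(timings) / len(timings)

print(average_response_time("http://www.example.com/"))  # placeholder URL
```

A consistently slow number here usually points at server or hosting constraints; a fast number paired with a slow page in the browser usually points at scripts, images and other page assets.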

These business and technical challenges tend to be the most difficult to resolve, especially for sites without dedicated IT resources. But, don’t panic – test first. If site speed is not an issue today, then file this away for discussion in the future as new features and technologies are added to the site. If site speed is an issue, then it’s probably something the business has been discussing already, since it impacts overall user experience. SEO revenue is yet another reason to prioritize the discussions around site speed and to make technical upgrades that could improve SEO as well as user experience.



Originally posted on Web PieRat.