Google’s Ever-Changing Algorithms

Excerpts from my latest article at Resource Interactive’s weThink blog: “Site Quality & Google’s Ever-Changing Algorithms.”

Google released 52 algorithm updates and changes in April 2012, an average of 1.73 per day. The Panda and Penguin updates received the most attention because they affected the most sites, but 50 other updates impacted search results as well. Most were focused on Google’s crusade to improve the quality of the sites that rank highest in its search results, but others included changes to indexing, spelling, sitelinks, sports scores features and more. Of course, April 1st also brought a new round of Google’s April Fools’ pranks, including the super geeky 8-bit Google Maps and Google Australia’s Street Roo instead of the usual Street View.

The most important updates in April, however, focused on site quality and spam prevention. First, Google quietly released the Panda 3.5 algorithm update on April 19. The Panda update, named for the Google engineer who developed it, targets sites with thin content or content reposted from other sites. This 3.5 update is just the latest of the Panda releases, which tend to arrive every four to eight weeks.

Sites that create fresh, unique content on a regular basis—such as ecommerce sites that release new products regularly and write their own unique product descriptions—shouldn’t have much trouble with Panda updates. Consumer product sites that feature unique content about their branded products as well as blog posts, Twitter feeds and other sources of unique content should likewise have no Panda problems.

A few days later on April 24, Google released another important algorithm update they codenamed “Penguin” that intensified Google’s war on webspam. Three percent of Google’s search rankings were affected in the U.S. The algorithm update seems to have hit sites that have overoptimized anchor text and low anchor text diversity, as well as a high number of links from topically irrelevant sites.

Read the article in full at Resource Interactive’s weThink blog »



Originally posted on Web PieRat.

10 SEO Geotargeting Tips Plus a Webinar

Excerpts from my latest article at Resource Interactive’s weThink blog: “10 Tips for Sending International SEO Signals.”


Search engine optimization for multinational sites isn’t all that different from SEO for a single U.S. site. It all boils down to the same three pillars: getting indexed, being relevant by using the most popular keywords, and being popular by acquiring links and social mentions. That said, the specifics of those three pillars differ for multinational sites versus a single U.S. site.

Each country has its own language, languages or even dialects within a language. Each country has a different mix of search engines that are popular among its citizens. China has Baidu, Japan still clings to Yahoo and most of the Americas, Europe and Africa prefer Google. In addition, international SEO multiplies the challenge of optimizing a single site by the number of countries targeted. A site for a single country with a host of SEO issues will likely have 10 times the issues or more when multiplied across 10 countries and languages. International SEO is an incredibly complex challenge, but these 10 geotargeting tips will get you started.

1.    Site Structure: Where to host each country’s content is the first critical question. The best option for SEO is a subdirectory such as www.site.com/en-uk/ and www.site.com/fr-ca/. Hosting content on a single domain enables each country to benefit from the links acquired by the other countries and the domain as a whole. Register the ccTLD (the top-level domain for each country) as a defensive measure and for promotion, and redirect that ccTLD to the content at the subdirectory. For example, site.co.uk would redirect to www.site.com/en-uk/. Country content can also be hosted at the ccTLD or a subdomain, but each comes with its own drawbacks.

Nine more geotargeting signals to come! Read the article in full at Resource Interactive’s weThink blog »

If you’d like to see a webinar on this topic, head on over to Practical eCommerce for my recent presentation on International Search Engine Optimization. The archived presentation is free, but you’ll need to log in to Practical eCommerce (also free) to view it. Enjoy!



Originally posted on Web PieRat.

Is It Duplicate Content or Just Undifferentiated?

More on one of my favorite topics: duplicate content. Finding it, finding its source, fixing it — it’s like a big geeky puzzle.

Excerpts from my latest article at Resource Interactive’s weThink blog: “Duplicate Content: Destroy or Differentiate.”

Duplicate content is an often misunderstood part of search engine optimization. Most digital marketers know it’s bad, but why? Duplicate content (two or more pages with different URLs that display the same content) makes it harder for a site to rank and drive traffic and conversions.


If 20 URLs each display the same page of content for red shoes, then all the links pointing to that page across a site are split across 20 different URLs. The page would have much more ranking power if all those links were pointing to a single URL. And that’s just internal links; consider the impact of splitting more valuable links from other sites across multiple URLs for the same page. Then add on the Facebook Likes, tweets, +1s, blog links and other actions that signal popularity to search engines, all split across 20 different URLs for that single page of content. In addition, duplicate content burns crawl equity, slowing a search engine’s progress as it crawls through a site to discover fresh new content.
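One way to spot URL variants that split link equity like this is to normalize every URL to a single canonical form before comparing. This is only a sketch: which query parameters count as tracking noise varies by site, and the parameter names below are common examples, not a definitive list.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical examples of parameters that change the URL without
# changing the content a shopper actually sees.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def normalize(url: str) -> str:
    """Collapse URL variants that serve the same content onto one form:
    lowercase the scheme and host, drop tracking/session parameters,
    sort what remains, and strip the trailing slash from the path."""
    parts = urlsplit(url)
    query = sorted((k, v) for k, v in parse_qsl(parts.query)
                   if k.lower() not in TRACKING_PARAMS)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.rstrip("/") or "/", urlencode(query), ""))
```

Grouping a crawl’s URLs by their normalized form quickly surfaces the “20 URLs, one page” clusters that should be consolidated to a single canonical URL.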

But sometimes content only looks duplicate. This is a common issue with ecommerce platforms that offer filtering options for better usability. The filters tend to create new slices of category content that look the same to search engines as the original default category page. For example, a category page of red shoes might have a filter for shoe style that includes tennis shoes, slip-on shoes, flats, high heels, etc. These are valuable pages to shoppers and searchers alike. But search engines can only tell the filtered pages apart if each page sends differentiating signals in its title tag, headings and other textual content.

Read more to find out how to tell if content is duplicate or merely needs differentiating: “Duplicate Content: Destroy or Differentiate.”

Read the article in full at Resource Interactive’s weThink blog »



Originally posted on Web PieRat.

Using Social Signals to Personalize Organic Search

Excerpts from my latest article at Resource Interactive’s weThink blog: “How Social Media Boosts Organic Search.”

Search engines like Google develop algorithms to determine the quality of a site’s content as well as its contextual relevance and link popularity. Site quality is a pretty nebulous concept for a piece of software to understand, but search engineers have linked social signals such as Facebook’s Likes, shares and comments, Google+’s shares, +1s and comments, and Twitter’s tweets and retweets to the quality of the page being shared. The more shares, the higher quality a page must be. There are other quality signals in play as well — hundreds of signals factor into each engine’s algorithm — but social signals are thought to be harder to manipulate than linking signals.

The most obvious way that social signals impact search results is in each individual searcher’s personalized search. For example, a Google search for “social search” returns different search results depending on whether I’m logged in to my Google account. On the left below are the search results I see when I’m logged out of Google search. On the right below are the results for the same search when I’m logged in to my Google account.

The point is that I may be the only person who will see this exact personalized search result. My circle of friends in Google+ shared 130 items relevant to the phrase “social search.” To have the same set of results, you would have to have those same 130 friends in your Google+ circles….

Read the article in full at Resource Interactive’s weThink blog »



Originally posted on Web PieRat.

Rich Snippets Add Eye-Catching Bling to Organic Search Results

Excerpts from my latest article at Resource Interactive’s weThink blog: “Using Rich Snippets to Attract More Search Customers.”

Top rankings can be hard enough to achieve sometimes, but the battle for organic search clicks doesn’t stop with high rankings in Google and Bing. Search result bling, more commonly known as rich snippets, draws searchers’ eyes by adding visual flair to the plain blue link and black description that usually make up a search result snippet. The most commonly seen rich snippet adds yellow reviews stars next to some search results, but recipes, music, software applications, mug shots, videos and more can all be incorporated into search results as rich snippets.


For example, which of these search results for Jessica Simpson’s Evangela shoes grabs your eye? Heels.com tops the organic search results with a video rich snippet at position 1, and Zappos is dead on their heels (pardon the pun) with a video snippet of their own. The number 3 result is a plain snippet for Heels.com that gets lost in all the visual competition, and Google claims the 3.5 position for its own visually enhanced Google Shopping results.
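Review stars and other rich snippets come from structured-data markup in the page’s HTML, such as schema.org microdata. As an illustration, this helper assembles a minimal Product/AggregateRating fragment; the product name and rating values are made up:

```python
def review_stars_microdata(name: str, rating: float, count: int) -> str:
    """Assemble a schema.org Product/AggregateRating microdata fragment,
    the kind of markup behind yellow review stars in search results."""
    return (
        f'<div itemscope itemtype="http://schema.org/Product">\n'
        f'  <span itemprop="name">{name}</span>\n'
        f'  <div itemprop="aggregateRating" itemscope\n'
        f'       itemtype="http://schema.org/AggregateRating">\n'
        f'    <span itemprop="ratingValue">{rating}</span> stars from\n'
        f'    <span itemprop="reviewCount">{count}</span> reviews\n'
        f'  </div>\n'
        f'</div>'
    )
```

The key is that the rating values live in machine-readable `itemprop` attributes, so the engines can pull them out and decorate the snippet.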

Find out more about rich snippets and how to implement them. Read the article in full at Resource Interactive’s weThink blog »



Originally posted on Web PieRat.