Is It Duplicate Content or Just Undifferentiated?

More on one of my favorite topics: duplicate content. Finding it, finding its source, fixing it — it’s like a big geeky puzzle.

Excerpts from my latest article at Resource Interactive’s weThink blog: “Duplicate Content: Destroy or Differentiate.”

Duplicate content is an often misunderstood part of search engine optimization. Most digital marketers know it’s bad, but why? Duplicate content (two or more pages with different URLs that display the same content) makes it harder for a site to rank and drive traffic and conversions.

If 20 URLs each display the same page of content for red shoes, then all the links pointing to that page across a site are split across 20 different URLs. The page would have much more ranking power if all those links were pointing to a single URL. And that’s just internal links; consider the impact of splitting more valuable links from other sites across multiple URLs for the same page. Then add on the Facebook Likes, tweets, +1s, blog links and other actions that signal popularity to search engines, all split across 20 different URLs for that single page of content. In addition, duplicate content burns crawl equity, slowing a search engine’s progress as it crawls through a site to discover fresh new content.
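To make that concrete, here’s a minimal sketch (Python, with hypothetical URLs) of one way to spot duplicates during a crawl: group URLs by a hash of the response body, and any group with more than one member is a consolidation candidate.

```python
# Sketch only: group URLs by a hash of their response bodies to surface
# pages that serve identical content from different addresses.
# These URLs are hypothetical; in practice the list comes from a crawl.
import hashlib
import urllib.request
from collections import defaultdict

urls = [
    "https://www.example.com/shoes/red",
    "https://www.example.com/shoes/red?sessionid=123",
    "https://www.example.com/category/42/red-shoes",
]

pages_by_hash = defaultdict(list)
for url in urls:
    with urllib.request.urlopen(url) as response:
        body = response.read()
    pages_by_hash[hashlib.sha256(body).hexdigest()].append(url)

# Any hash shared by multiple URLs marks content whose links, Likes,
# tweets and +1s are being split instead of consolidated.
for digest, dupes in pages_by_hash.items():
    if len(dupes) > 1:
        print("These URLs serve the same content:", *dupes, sep="\n  ")
```

An exact byte-for-byte hash is naive (a timestamp or session token in the markup will hide a near-duplicate), but it’s enough to surface the worst offenders.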

But sometimes, content only looks like it’s duplicate. This is a common issue with ecommerce platforms that offer filtering options for better usability. The filters tend to create new slices of category content that look the same to search engines as the original default category page. For example, a category page of red shoes might have a filter for shoe style that includes tennis shoes, slip-on shoes, flats, high heels, etc. These are valuable pages to shoppers and searchers alike. But search engines can only determine the differences between each of the filters if the page sends differentiating signals in its title tag, headings and other textual content on the page.
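A quick way to audit this is to pull the title tag and main heading from each filtered URL and compare them. Here’s a rough sketch, assuming hypothetical style-filter URLs on a red-shoes category:

```python
# Sketch: extract the <title> and first <h1> from each filtered category
# URL to check whether the filters send differentiating signals.
import urllib.request
from html.parser import HTMLParser

class SignalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self._current = None
        self.signals = {"title": "", "h1": ""}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current and not self.signals[self._current]:
            self.signals[self._current] = data.strip()

# Hypothetical filter URLs for illustration.
filter_urls = [
    "https://www.example.com/shoes/red",
    "https://www.example.com/shoes/red?style=flats",
    "https://www.example.com/shoes/red?style=high-heels",
]

for url in filter_urls:
    parser = SignalParser()
    with urllib.request.urlopen(url) as response:
        parser.feed(response.read().decode("utf-8", errors="replace"))
    # Identical titles and headings across filters look duplicate to engines.
    print(url, parser.signals)
```

If every filter prints the same title and heading, the engines have nothing to differentiate on, and the pages will compete as duplicates.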

Read more to find out how to tell if content is duplicate or merely needs differentiating: “Duplicate Content: Destroy or Differentiate.”

Read the article in full at Resource Interactive’s weThink blog »


Originally posted on Web PieRat.

Syndicating Content for SEO Benefit

My latest article is at Practical Ecommerce; read it in full here.

When ecommerce companies think about content syndication, they typically consider acquiring content that others have written to beef up their own sites. Depending on the goal, placing content from other sites onto your own can be beneficial from a branding, partnership, or reference point of view, but rarely for search engine optimization.

By its nature, content syndication tends to create duplicate content (a topic I addressed here previously in “SEO: There Is No Duplicate Content Penalty”) because one site creates the content and one or more other sites republish it. But it doesn’t have to be duplicate content.
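The full article covers the options, but one common safeguard (my shorthand here, not a quote from the article) is for the syndicated copy to point a cross-domain rel="canonical" back at the original. A quick sketch for checking that, with hypothetical URLs:

```python
# Sketch: check whether a syndicated copy declares a rel="canonical"
# pointing back to the originating site, so engines consolidate signals.
import urllib.request
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical URLs for illustration.
syndicated_copy = "https://partner-site.example/reposted-article"
original = "https://your-site.example/original-article"

finder = CanonicalFinder()
with urllib.request.urlopen(syndicated_copy) as response:
    finder.feed(response.read().decode("utf-8", errors="replace"))

if finder.canonical == original:
    print("Copy credits the original; duplicate signals consolidate.")
else:
    print("No canonical back to the source; the copy competes as a duplicate.")
```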

Read more »


Originally posted on Web PieRat.

8 SEO Reasons to Crawl Your Sites

My latest article is at Ecommerce Developer; read it in full here.

The first thing I do when working with a new site is set my favorite crawler on it. This gets me acquainted with all the URLs, site sections, interlinkings, forgotten pockets, scars and warts. A good crawler offers a wealth of data useful not just to search engine optimization, but also to site maintenance in general.

Luckily, some great crawlers are free. You’ll find pages of options just by Googling “web crawler” or a similar term. Xenu Link Sleuth is my favorite for the price — it’s free — and for the broad assortment of data collected on every URL it crawls. GSite Crawler is another good, free alternative. It’s focused mainly on creating XML sitemaps and feeds, but it’s good for other uses as well.
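If you’re curious what a crawler does under the hood, here’s a stripped-down sketch in Python. Real tools like Xenu collect far more data per URL; this one just records status codes and titles while following same-host links, which is the basic shape of the job.

```python
# Sketch of a tiny breadth-first crawler: follow same-host links and
# record each URL's status code and title. Not a substitute for a real tool.
import re
import urllib.request
from urllib.parse import urljoin, urlparse

def crawl(start_url, limit=100):
    host = urlparse(start_url).netloc
    seen, queue, report = set(), [start_url], {}
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url) as response:
                status = response.status
                html = response.read().decode("utf-8", errors="replace")
        except Exception as exc:  # 404s, timeouts, bad markup, etc.
            report[url] = f"error: {exc}"
            continue
        title = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
        report[url] = (status, title.group(1).strip() if title else "")
        for href in re.findall(r'href="([^"]+)"', html):
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == host:
                queue.append(absolute)
    return report

# Hypothetical starting point.
for url, info in crawl("https://www.example.com/").items():
    print(url, info)
```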

Read more »


Originally posted on Web PieRat.

“Search Friendly” Ecommerce Platforms?

After working with ecommerce SEO clients for several years, I’m about fed up with “search friendly” platforms, content management systems & site features. They all come with their own baggage, their own set of issues that need to be tested for and controlled. Which takes IT resources. Which are hard to come by, especially when you’re talking about something as difficult to determine ROI for as structural SEO updates.

And invariably, the client isn’t jazzed about revising their platform to optimize structural SEO issues because they thought they bought something that was good for SEO in the first place.

The problem is, SEO friendliness is more than enabling automated & manually customizable title tags. It’s how functionality impacts URLs, how many URLs are generated for each page of content, whether the navigation path affects URL structure, how tracking parameters are passed, whether sorting modifies the URL, whether categories and products are assigned unique persistent IDs, whether IDs are reused, and many, many more questions that impact SEO.

I’m looking for the platform that enables and enforces a single URL for a single page of content, system-optimized for a unique automated keyword phrase that can be overridden by manual optimization. ONE URL, ONE page, ONE unique keyword theme. Across thousands of pages on an ecommerce site.
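For illustration, the normalization a platform would have to enforce looks something like the sketch below. The parameter names are hypothetical, and a real implementation would also 301 every non-canonical variant to the one true URL.

```python
# Sketch: collapse the many URL variants a platform generates into one
# canonical URL per page. Parameter names here are hypothetical.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}
PRESENTATION_PARAMS = {"sort", "view", "page_size"}

def canonical_url(url):
    parts = urlparse(url)
    kept = sorted(  # stable order so equivalent URLs compare equal
        (key, value)
        for key, value in parse_qsl(parts.query)
        if key not in TRACKING_PARAMS | PRESENTATION_PARAMS
    )
    return urlunparse((
        parts.scheme.lower(),
        parts.netloc.lower(),
        parts.path.rstrip("/") or "/",
        "",                  # no path parameters
        urlencode(kept),
        "",                  # fragments never reach the server
    ))

print(canonical_url(
    "https://Example.com/shoes/red/?sort=price&utm_source=email&color=red"
))
# -> https://example.com/shoes/red?color=red
```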

If there’s a platform out there that does this without heavy customization, I haven’t worked with it yet. Pray tell, what’s your favorite “search friendly” platform?


Originally posted on Web PieRat.

Redesign with SEO in Mind

A site redesign or switch to a new platform is kind of like a rebirth – it’s one of the most exciting and nerve-wracking times for the entire Internet marketing team. With everyone caught up in the branding, design, usability and technology, the impact on SEO can sometimes be forgotten until the last minute.

I wrote this article on redesigning a site with SEO in mind back in July for MultichannelMerchant.com and gave up watching for it to be published… so I missed its publish date in September. Maybe you did too. Here’s a redux of the original article.

While it’s difficult to determine what the natural search impact will be until working code hits a development server, holding several mantras in mind and repeating them liberally will keep the team focused on the most critical elements to plan for SEO success. I love these mantras — I actually say them to myself as I’m auditing sites.

SEO Development Mantras

  1. Links must be crawlable with JavaScript, CSS and cookies disabled.
  2. Plain text must be indexable on the page with JavaScript & CSS disabled.
  3. Every page must send a unique keyword signal.
  4. One URL for one page of content.
  5. We’re going to 301 that, right?

When a site is stable on the development server and the URLs are ironed out, identify a 301 redirect plan, build the new XML sitemap and make sure a measurement plan is in place to gauge the impact of the relaunch.
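Verifying the redirect plan is easy to script. Here’s a minimal sketch (with a hypothetical URL mapping) that requests each old URL without following redirects and confirms a 301 to the mapped new URL:

```python
# Sketch: confirm each old URL in the redirect plan answers with a 301
# pointing at its mapped new URL. The mapping below is hypothetical.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # surface the redirect as an HTTPError, don't follow it

opener = urllib.request.build_opener(NoRedirect)

redirect_map = {
    "https://www.example.com/old-category/red-shoes.html":
        "https://www.example.com/shoes/red",
}

for old_url, new_url in redirect_map.items():
    try:
        opener.open(old_url)
        print(f"FAIL {old_url}: responded 200, no redirect")
    except urllib.error.HTTPError as err:
        location = err.headers.get("Location")
        if err.code == 301 and location == new_url:
            print(f"OK   {old_url} -> {location}")
        else:
            print(f"FAIL {old_url}: {err.code} -> {location}")
```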

All this is covered in more detail in the original article at Multichannel Merchant: https://multichannelmerchant.com/ecommerce/0901-using-seo-redesign/index.html


Originally posted on Web PieRat.