[Image: identical twins, via Wikipedia]

A couple of months ago, I posted on the search engines' so-called "duplicate content penalty," where pages that contain similar words are often hidden from the search results because the search engines (rightly) conclude that searchers would rather see different pages. Recently, I was e-mailed a follow-up question about a particularly difficult aspect of the duplicate content penalty--when you have two keyword phrases whose landing pages really could be twins. What do you do then?
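To see why search engines treat near-twin pages this way, it helps to picture how near-duplicate detection can work. Here is a toy sketch (my own illustration, not any search engine's actual algorithm) that compares two snippets by the Jaccard similarity of their overlapping word "shingles." Real systems use far more sophisticated fingerprinting, but the intuition is the same:

```python
def shingles(text, k=3):
    """Break text into the set of overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical landing pages that differ only in the keyword phrase:
page_a = "our leadership development program builds skills for new managers"
page_b = "our management training program builds skills for new managers"

print(round(jaccard_similarity(page_a, page_b), 2))  # → 0.4
```

Even in this crude measure, swapping one phrase leaves much of the page's fingerprint intact; whole pages that differ only in a keyword would score far higher still, which is why a search engine might conclude they are duplicates and show only one.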

Here's an excerpt from the question I got from Andy:

One of our programs is in leadership development, so we'd like to optimize a set of pages around this (and related) keywords. However, people often use the term "management training" interchangeably with "leadership development." Therefore we were thinking of optimizing different pages for each of these different terms. The problem is that apart from differences in these two terms, the content on these pages would be identical, so the concern is that search engines would see the pages as duplicates and index only one of them. Would you suggest that we discard our idea of creating separate (and potentially duplicate) pages and instead just create one page?

I'm sorry, Andy, but your approach would probably prove problematic. In my opinion, you have at least two options (others might suggest more), either of which can work:

  • Do as you describe and use both keywords interchangeably within the same page. For less competitive keywords, that will probably work just fine. It's possible that the search engines are smart enough to know that those terms are similar, so it might work regardless of competition. But if it doesn't work, you might need to go the extra mile.
  • Create two separate pages with different content on them, each one focused on a different term. This is far more expensive, but in the end, it is safer and much more likely to work, not only for search engines but for conversions. It is likely that the people searching for one term rather than the other are driven by different needs and wants, so different language optimized for them will probably convert better as well as rank better.

So, in Andy's case, consider the possibility that those searching for "management training" might be looking for something somewhat different from "leadership development" searchers, even though Andy thinks of the terms as interchangeable. Perhaps those looking for management training are new to line management, while those looking for leadership development are team leaders who are not yet managers. Or perhaps the leadership folks are long-time managers who have just become executives.

Regardless, you give up a lot of marketing segmentation information when you assume that people using different keywords are the same just because you have a single product that helps them both. Subtle differences in copy that emphasize different benefits might well be called for. By lumping all these searchers into one category, you miss the chance to test what will optimize your conversions for each segment.

Often people feel trapped into duplicate content because their landing page contains the description for their product when it could spend more time on the problem. Focusing landing pages on the somewhat different ideas of management training and leadership development allows you to have each of those pages link to a common page that discusses your offering for both of those problems. You can still have one place that describes your product but with different landing pages for different keywords.

So, if you feel as though duplicate content makes sense in your situation, ask yourself whether you are trying to do things cheaply rather than well, and whether that false economy is itself wasting your time. Also consider whether the added conversions might easily pay for the additional content costs. Focus each landing page on the way you attract people to the problem; that way you can link to a common page that describes the solution, which reduces the impulse for duplicate content.


September 8, 2010





Mike is an expert in search marketing, search technology, social media, publishing, text analytics, and web metrics, who regularly makes speaking appearances.

Mike's previous appearances include Text Analytics World, Rutgers Business School, SEMRush webinar, ClickZ Live.

Mike also founded and writes for Biznology, is the co-author of Outside-In Marketing (with James Mathewson) and the best-selling Search Engine Marketing, Inc. (now in its 3rd edition), and is the sole author of Do It Wrong Quickly, named by the Miami Herald as one of the 11 best business books of 2007.






Comments (8)

There is another option but the answer may not be to your liking. I played around with the duplicate content algorithm a while back and learned that if you change paragraph structure on a page, utilize different images and different alt tags and then increase keyword density, you can have essentially the same content but laid out in a different way and you will not count as duplicate content.
Consider how Google looks at news sources. Most stories start from the AP and are then re-purposed by another news source and displayed to the world. With thousands of newspapers writing off the same AP release, you are bound to run into situations that would be duplicate content, granted not very many. If your paragraphs are re-structured so the page still makes sense semantically, and your images are different with different keyword-rich alt tags, you could optimize two pages that are basically identical but do not qualify as duplicates.

Hi Chris,

I am certain that there are ways to game the algorithm so that you can do less work and sneak through, but I wonder if all of that effort would be better applied to the work I laid out, because you are just waiting for a small change to the algorithm to undo what you've created. To me, what makes the most sense is really going after individual keyword phrases with individual pages, but I have no objection to other approaches. To me, it isn't worth the risk or the disruption when the rules shift, as they invariably do.

Hi,

Nice post to discuss in here. I have always been worried of duplicate content. I have two questions in my mind.

1) How to stop our own website content from being copied by other webmasters?

For e.g., I am a blogger and blog about SEO and other internet marketing stuff. If someone likes my post and posts it to their blog, then whose content will be considered original and whose duplicate? Is there any way to tell search engines that mine is the original content?

2) Now, as discussed in the post by Andy, what are the guidelines for duplicate content on the same domain? I am not talking about rankings here. Does Google or other search engines blacklist that page or website anyhow? Because in blogs you can see we have the same post on the post URL, category page, and archives page.

Hi Rajeesh,

There isn't any way to prevent people from copying and pasting your content. It happens to me all the time. You can set a Google Alert on the keywords in your post to find out when it happens, but the kind of person who would plagiarize your piece is unlikely to respond favorably when you catch them. So, you might have to resign yourself to some stealing going on. Usually the search engine can tell which post was first and will usually show that one. If they don't, you can appeal to the search engine to show yours instead of the copied one, and they are almost always responsive. This article explains how to appeal: http://www.searchengineguide.com/mike-moran/how-do-i-avoid-the-duplicate-content-pen.php

When pages are duplicated on a site, Google shows just one of them. It does not blacklist your site. However, if your site has scads of duplicated content across the site, that could mark your site as being of lower quality and saddle it with lower search rankings. No one knows exactly what is in the ranking algorithms, but it is a fair guess that lots of duplicate content could be harmful to your site's ranking. Blog sites are not an issue here--the search engine usually tries to show the post archive page, as it should. That kind of duplicate content won't ever harm your search results.

Hey Mike,

Thanks for the reply. I got your point as Google will serve on first come basis. What is your opinion regarding Copyscape.

Does it help any how?

Hi Rajeesh,

Copyscape is a well-regarded program that can help content owners determine if their content has been plagiarized. It's been used in court cases and other proceedings to establish copyright violations, so if that is what you need, I have heard good things about it, although I haven't used it myself.

Thanks Mike,

Your responses were helpful.. Keep on posting such information..

Well, as you are saying, duplicate content can be penalized by search engines, but using the same text content with different media content related to the subject can save your page from being flagged as duplicate content. I have seen some code in Google searches that can prevent copy and paste, and other stuff, but I don't remember the link right now. If I remember it and find it again, I will post it here.



Is duplicate content OK for duplicate keywords?