Nov 30

Six Reasons Why Duplicate Content is Bad for SEO

Posted by Jake Neeley


Duplicate content is one of the top concerns of online publishers and content marketers. In the SEO world, it means the same content published at more than one URL, and it can negatively affect a website’s ranking.

“Duplicate content is content that appears on the Internet in more than one place (URL),” said SEOMoz. “This is a problem because when there is more than one piece of identical content on the Internet, it is difficult for search engines to decide which version is more relevant to a given search query.”
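To make that definition concrete, here is a minimal sketch in Python of how identical content at two URLs might be spotted: fetch both pages, strip the markup, and compare fingerprints of the remaining text. The URLs are placeholders chosen for illustration, and real duplicate-detection tools tolerate small differences far better than an exact hash does.

    # A rough way to spot "the same content at more than one URL":
    # fetch two pages, strip the markup, and compare fingerprints
    # of the remaining text. Both URLs below are placeholders.
    import hashlib
    import re
    import urllib.request

    def text_fingerprint(url):
        """Fetch a page and hash its visible text, ignoring markup and spacing."""
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
        text = re.sub(r"<[^>]+>", " ", html)               # crude tag stripping
        text = re.sub(r"\s+", " ", text).strip().lower()   # normalize whitespace/case
        return hashlib.sha256(text.encode("utf-8")).hexdigest()

    # Hypothetical URLs, for illustration only.
    a = text_fingerprint("https://example.com/original-post")
    b = text_fingerprint("https://example.com/syndicated-copy")
    print("duplicate content" if a == b else "distinct content")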

Is duplicate content bad for your brand? Many believe it is, including Google, who advises sites to carefully consider the reasons for publishing duplicate content. The following six points may help Midphase hosting customers interested in protecting their Google search rankings.

  1. Negative User Experience

    Google’s main goal is to provide relevant search results to users. Duplicate content creates a negative user experience because users are repeatedly directed to the same content. This wastes their time; readers want fresh content that matches their query.

  2. Affects Search Engine Rankings

    “Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users,” said Google.

    “We’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.”

  3. Affects Brand Credibility

    Readers share high-quality content that resonates with them. Publishing duplicate content across domains undermines your brand’s credibility and your integrity as a thought leader in your niche.

  4. Search Engines Don’t Know Which Page(s) to Exclude

    “If you re-publish your blog content on an article submission site, you’re creating duplicate content that the search engines need to filter through in order to find the original source,” said Scott Benson of Vocus.

    “The engines are trying to determine which version is the original, and therefore should rank highest. You don’t want to make things harder for them.” One common way to point the engines at the original is the canonical link tag; see the sketch after this list.

  5. Difficult to Direct Link Metrics

    Search engines find it difficult to direct the link metrics of your site (PageRank, trust, anchor text, etc.) to a single page when duplicate content exists across domains. It’s highly recommended to optimize your website and publish fresh content to attract new visitors.

    New visitors will then begin to share links to your blog via social media or outbound links from their own sites. These signals tell the search engines that your blog provides valuable content to readers.

    “As these signals increase, the engines will assign authority to your blog,” said Benson. “This authority is passed back into your core site, increasing your domain strength – a major ranking signal.”

  6. Traffic Decreases

    When duplicate content is present, site owners may suffer lower rankings and, consequently, lower traffic, while search engines return less relevant results, according to SEOMoz.
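Benson’s point above raises a practical question: how do you tell the engines which version is the original? One widely used signal is the rel="canonical" link tag. Below is a minimal sketch, again in Python with a placeholder URL, that reads whatever canonical URL a page declares; it illustrates the general technique rather than any tool mentioned in this post.

    # A minimal check for the rel="canonical" tag a page declares.
    # The URL is a placeholder; the tag itself is a standard signal
    # publishers use to identify the original version of a page.
    import urllib.request
    from html.parser import HTMLParser

    class CanonicalFinder(HTMLParser):
        """Collects the href of any <link rel="canonical"> tag."""
        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "link" and attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

    url = "https://example.com/syndicated-copy"  # hypothetical URL
    finder = CanonicalFinder()
    finder.feed(urllib.request.urlopen(url).read().decode("utf-8", errors="ignore"))
    print(finder.canonical or "no canonical URL declared")

If a syndicated copy’s canonical tag points back at your original post, the engines get a clear hint about which version should rank.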

Overall, content is king. It’s best to write for humans instead of machines, and Google is encouraging this approach more than ever.


About Jake Neeley

Jake Neeley is a content marketing and social media geek who loves reading, outdoor sports (especially those in Utah mountains), and time with his family. Connect with Jake on Google+, Twitter, and LinkedIn.


2 Responses to “Six Reasons Why Duplicate Content is Bad for SEO”

  • Laura, December 2nd, 2012 at 9:17 AM

    I agree, Jake. Duplicate content can be cause for concern. I use Copyscape to locate duplicate articles online. What tool do you recommend?
