[Discussion] Duplicate Content and SEO

What is duplicate content? And why is it bad for SEO? Get all the info right here.

It’s long been known that websites shouldn’t have duplicate content, but the reasons why, and how it can affect your SEO rankings, are often explained vaguely.

A clear understanding of why original content is so valuable can help you develop a better content strategy without the disadvantages that identical copy can bring to your site.

According to Google:

“Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin.”

Read more: https://support.google.com/webmasters/answer/66359?hl=en

TYPES OF DUPLICATE CONTENT:

  1. Internal duplicate content is when your website creates duplicate content through multiple internal URLs (on the same website).
  2. External duplicate content, also known as cross-domain duplicates, occurs when two or more different websites have the same page copy indexed by the search engines.

METHODS OF IDENTIFYING DUPLICATE CONTENT:

There are many tools and methods available to help you identify any content that could pose a problem.

  • CopyScape
  • Siteliner
  • SEMRush

You can also manually search for your content using Google search: copy a small excerpt of text from your article, then enter it into the search bar (in quotation marks, for an exact match) to see if similar results show up.

While it’s more time consuming, it’s an effective way of identifying duplicate content elsewhere on the web.
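If you want to triage suspected copies programmatically before checking them by hand, a rough similarity score is enough to flag pages for manual review. Here is a minimal sketch using only Python’s standard library; the example strings are hypothetical, and this compares raw text, not live pages:

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Return a 0..1 ratio of how similar two blocks of text are,
    compared word by word (case-insensitive)."""
    return difflib.SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

original = "Duplicate content refers to substantive blocks of content across domains"
suspected = "Duplicate content refers to large blocks of content across domains"

# Identical text scores 1.0; "appreciably similar" text scores close to it.
print(f"{similarity(original, suspected):.2f}")  # prints 0.90
```

A score near 1.0 doesn’t prove plagiarism, but it tells you which pairs of pages are worth a human look.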

FIXING DUPLICATE CONTENT ON YOUR SITE:

There are various ways to fix SEO issues related to duplicate content:

  1. 301 redirects and rel=”canonical” tags
  2. rel=”prev” and rel=”next” for pagination pages
  3. DMCA (Copyright) Complaint to Google
  4. Effective content marketing strategy
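As a quick sketch of the first two fixes (all URLs here are placeholders): a canonical tag tells search engines which version of a page is the preferred one, and the prev/next tags describe a paginated series.

```html
<!-- On a duplicate or parameterised URL, point search engines
     at the preferred version (example.com is a placeholder): -->
<link rel="canonical" href="https://example.com/original-article/" />

<!-- On page 2 of a paginated archive, declare the series: -->
<link rel="prev" href="https://example.com/blog/page/1/" />
<link rel="next" href="https://example.com/blog/page/3/" />
```

A 301 redirect, by contrast, is set server-side; for example, `Redirect 301 /old-page/ https://example.com/new-page/` in an Apache .htaccess file permanently forwards the old URL to the new one.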

Google tried to kill off the duplicate content issue years ago with the Google Panda update.

Google’s Panda algorithm update is a search filter introduced in February 2011, meant to stop sites with poor quality content from working their way into Google’s top search results. Panda is updated from time to time, and sites previously hit may escape if they’ve made the right changes.

WILL HAVING DUPLICATE CONTENT AFFECT MY SITE?

It’s best to avoid duplicate content, not necessarily because the penalty is as scary as many search engine myths suggest, but because it works against your search engine and online business success by keeping your pages from ranking well, or even from showing up in search results.

WILL REPUBLISHING MY EXISTING CONTENT AFFECT MY SITE (FROM GUEST BLOGGING, ARTICLE SUBMISSION, WEB 2.0 SITES — LINKEDIN, MEDIUM ETC.)?

Googlebot visits millions of sites every day. If it finds a copied version of something a week later on another site, it knows where the original appeared first. Googlebot doesn’t get angry and penalize; it moves on.

Reusing the first paragraph of your blog post is fine, but when you start re-posting a few paragraphs or more, you are asking for trouble.

When there are too many duplicate pages on your own site, you end up diluting your ability to rank for that particular page by forcing those pages to compete against one another in the search engines.

It’s always better to rewrite the content before publishing, or to post the content on the other site only, since sites such as Medium and LinkedIn will promote your article to their own community members if it’s worthy enough.

Republishing your content won’t help you generate much more traffic than you already have. I’ve tested this scenario with authority sites that get over a million visitors a month, and it’s very rare that they drive more than 1–10% of their visitors from a repost. (As SEOs, we mostly look into acquiring links from these authority sites.)

There may be a few scenarios where duplicate content will outrank the original:

1) When the duplicate adds more value than the original. The author might add graphs, data, images, video, etc., to cover the subject with more depth and clarity.

2) The authority of the website where the duplicate is found is much higher than the website where the original is posted.

3) Original content isn’t optimised for SEO. No clear title tag, H1 (heading tags), slow loading speed, not mobile responsive, etc.

A SIDE NOTE:

Yesterday, I answered a similar query about republishing content from LinkedIn to one’s own blog/website in one of the DM communities.

Later, I noticed a senior SEO marketer commented “Just think bots are blind so as google 😁”, backing it with the statement that he had personally tested the scenario.

I understand every marketer has his own perspective, style, tricks and strategy. But my concern is blindly conveying a wrong or incomplete message to newbies.

I switched my focus from Quora answers to Facebook communities almost 3 years ago, as I feel they have more authenticity (apart from Inbound.org & GrowtHackers.com).

I also see many senior digital marketing experts/gurus get irritated when asked repeated queries about the same topics in SEO, or in general.

If you teach DM to people and consider yourself a teacher, then it’s your responsibility to answer your followers, students and aspiring marketers, or at least point them in the right direction.

The industry is already facing a downfall due to false and fake gurus, and I urge each one of you to stop blindly following anyone.

And if possible, share your genuine experience with marketers and agencies through webinars, courses, institutes, offline events and meetups, so others can be aware of it too.

Hope this helps.

--

Roshan Ambler | Digital Marketing Practitioner

Founder of @GoLeadDigital. I live and breathe digital marketing. I hope you enjoy my findings.