
Fix Duplicate Content on Your Site

Category: SEO

Many bloggers, especially new ones, create duplicate content within their own sites. For those who already run a site with a large number of internal pages, fixing it can be a real challenge, because the site keeps being updated.

Over time, sites accumulate multiple URLs that all point to the same content. Having duplicate content on a site is not a problem in itself, but it can hinder the crawling and indexing of those pages. PageRank can also be spread across pages that are not recognized as duplicates, and this can hurt your site's position in the results presented by search engines.

Tips to fix duplicate content on your site:

  • Recognize what is duplicate content:

Recognizing the duplicate content on your site is the most important step. The easiest way, in my opinion, is to select a snippet of text from one of your pages and search for it, limiting the results to pages within your own site with the “site:” operator. If you get several results for the same content, you can start working on those pages.
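The same check can be automated once you have the pages in hand. A minimal Python sketch (the paths and page bodies below are made-up placeholders, not real pages) fingerprints each page's text so that two URLs with the same fingerprint are candidates for duplicate content:

```python
import hashlib

def content_fingerprint(text: str) -> str:
    """Fingerprint of the text, ignoring case and whitespace differences."""
    normalized = " ".join(text.split()).lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical URL -> body pairs; on a real site you would fetch the HTML
# and strip the markup first.
pages = {
    "/ebooks/webmasters/google.html": "Top 10 eBooks for Webmasters",
    "/ebooks/webmasters/google?det=top10": "Top 10  eBooks for\nWebmasters",
    "/shop/index.php?product_id=4": "A totally different product page",
}

seen = {}
for url, body in pages.items():
    fp = content_fingerprint(body)
    if fp in seen:
        print(f"duplicate: {url} matches {seen[fp]}")
    else:
        seen[fp] = url
```

The point of the normalization is that whitespace and case differences alone should not hide a duplicate.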

  • Determine the preferred URL:

Before starting to solve any duplicate content problem, you should determine your preferred URL by answering one simple question: which URL do you prefer for this content?

  • Be consistent within your site:

Once you have chosen your preferred URL, make sure you use it everywhere within your site (including in your Sitemap).

  • Use a 301 permanent redirect where possible and necessary:

Redirecting duplicate URLs to the preferred URL helps both users and search engines reach the preferred URL even when they visit a duplicate one. The best way to do this is with a 301 (permanent) redirect.
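As a rough sketch of the idea, a 301 response simply points the client at the preferred URL. The redirect map and paths below are illustrative only, and in practice you would normally configure 301s in your web server (Apache, nginx, etc.) rather than in application code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical map of each duplicate URL to its preferred URL.
REDIRECTS = {
    "/ebooks/webmasters/google": "/ebooks/paid/webmasters/google.html",
    "/shop/google-top10": "/ebooks/paid/webmasters/google.html",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)              # permanent redirect
            self.send_header("Location", target)  # where the content now lives
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

# To actually serve:
# HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```

A 301 (rather than a 302) tells crawlers the move is permanent, so they consolidate signals onto the preferred URL.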

  • Apply the rel=“canonical” attribute:

When you cannot use a 301 redirect, the rel=“canonical” attribute helps search engines understand your site and its preferred URLs. This annotation has been standardized beyond Google and is also supported by other search engines such as Ask.com, Yahoo! and Bing.
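For illustration, the element you place in the `<head>` of every duplicate page can be generated like this. The helper function is my own; only the `rel="canonical"` link element itself is the standard part:

```python
def canonical_tag(preferred_url: str) -> str:
    """Build the <link> element to place in the <head> of each duplicate page."""
    return f'<link rel="canonical" href="{preferred_url}" />'

print(canonical_tag("http://www.esoftload.info/ebooks/paid/webmasters/google.html"))
```

Every duplicate page carries the same tag pointing at the one preferred URL, which tells crawlers which version to index.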

  • Use the URL parameters tool in Google Webmaster Tools:

If your duplicate content comes from URL query parameters, the URL parameters tool in Google Webmaster Tools lets you tell Google which parameters matter and which are irrelevant to your URLs. For example, if you run an online eBook shop and your leading product is an eBooks-for-Webmasters guide, the product page might be reachable through several different URLs (session IDs, affiliate IDs and so on):

    www.esoftload.info/ebooks/paid/webmasters/google.html
    www.esoftload.info/ebooks/webmasters/google?det=top10
    www.esoftload.info/shop/index.php?product_id=4&highlight=google+top10&cat_id=4&sessionid=233=1231&affid

If search engines know that these pages share the same content, they will index only one version in the search results. Through the Google tool, the irrelevant parameters can be ignored, reducing duplicate content on your site.

If you tell Google to ignore the sessionid parameter, it will treat www.esoftload.info/ebooks/paid/webmasters/google.html?sessionid=27529 the same as www.esoftload.info/ebooks/paid/webmasters/google.html.
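You can apply the same normalization on your own side, for example when deduplicating URLs in logs or before building a sitemap. A sketch using Python's urllib.parse, where the IGNORED_PARAMS set is an assumption for illustration:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that do not change the page content.
IGNORED_PARAMS = {"sessionid", "affid"}

def strip_ignored_params(url: str) -> str:
    """Drop content-irrelevant query parameters so duplicates collapse to one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in IGNORED_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_ignored_params(
    "http://www.esoftload.info/ebooks/paid/webmasters/google.html?sessionid=27529"))
```

Two URLs that differ only in ignored parameters normalize to the same string and can then be treated as one page.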

  • Be careful with robots.txt:

I do not recommend blocking access to duplicate content through the robots.txt file or similar workarounds. If you block crawler access instead of using the rel=“canonical” attribute or 301 redirects, search engines will treat the blocked URLs as separate, independent pages and will never learn that they are different URLs for the same content.

It is better to let these URLs be crawled and treated as duplicates, using one of the methods recommended here. By allowing them to be crawled, the crawler can apply your duplicate-identification signals as soon as it sees a new URL, avoiding unnecessary crawling later.

If duplicate content is causing search engines to crawl too much of your site, you can adjust the crawl rate in Google Webmaster Tools so that the maximum number of pages is crawled on each visit without overloading your server.

Attention: Google's new algorithm is very strict about duplicate content, so fix yours soon to avoid being penalized by Google.

18 thoughts on “Fix Duplicate Content on Your Site”

  1. Oops !
    I would say “AVOID DUPLICATE CONTENT”. A blog needs unique content that is not found on any other site. That gives lots of advantages, like a good rank in search engines, a good Google crawl rate, and good traffic too.

    So better avoid copying content. If you already have some duplicate content, then make use of the tips mentioned above in this post to fix it 🙂

    1. Google's new algorithm is very strict about duplicate content, so it's better to avoid it if you want to rank well in Google. Thanks for your recommendation, Praveen.

  2. These are good points. It’s always difficult to avoid duplicate content when you have a headline page on your site. It’s usually the landing page and it may contain headlines and 100 word introductions to other articles on your sites. I change the wording of the introductions, so that the wording for the introduction under the headline is different to the wording of the introduction to the 500 word article on an inside web page.

  3. Oh! Thanks Isha for the warning! Will fix it soon!

  4. Having duplicate content is a big problem and we need to be careful about it. Here I would like to share my personal experience: once I changed my permalinks, and because of this I got hundreds of duplicate content URLs on my blog. So my suggestion is: before changing permalinks, consider a redirect plug-in and the tips above.

    1. Thank you Rakesh for sharing your personal experience with us.

  5. I agree keeping the robots.txt file up to date helps search engines rank a website better

  6. Avoiding duplicate content may help keep Google from penalizing websites with low rankings.

  7. Avoiding duplicate content is very important for maintaining site quality. Very useful information for new webmasters.

  8. Good points to avoid and fix Duplicate contents. Nice post.

  9. In my opinion, if you want your site to perform well, avoid duplicate content.
    You can always create a site about the things you are passionate about, because all the content will be provided by you in your own words, so there's less chance of duplication. But if you do have duplicate content on your site, then follow the points mentioned above.

    1. thanks for the comment savio..

  10. Great tips friend. This is really must for every webmaster. Thanks indeed.

  11. “Using the 301 permanent redirect, where possible and necessary”

    How on earth could I forget this one?! Thanks for updating my memorybank! 🙂

  12. Search engines penalize websites with duplicate content, and I agree it is really important to update the robots.txt file to help the Google crawler.

  13. By default in WP you can find duplicate content on different archive pages (by day/month/year), tag pages, and author pages. You could add some WordPress-specific tips …

  14. Btw, canonical was introduced by Google first. I did not know whether the others supported it or not. From your post it looks like they do now.

  15. Nice information, thanks. Google is getting stricter day by day about duplicate content and websites resorting to black-hat SEO. Getting your website indexed properly is not so easy nowadays. This blog post will help, to some extent, those companies that want to create a company website and promote it over the internet.

Comments are closed.

