
Fix Duplicate Content On Your Site


Many bloggers, especially new ones, create duplicate content within their own site. For those who already run a site with a large number of internal pages, fixing it can be a real challenge, since the site keeps receiving updates.

Over time, sites end up accumulating pages and duplicate URLs that all point to the same content. Having duplicate content on a site is not a problem in itself, but it can hinder the crawling and indexing of those pages. PageRank can also end up spread across pages that are not recognized as duplicates, and this can hurt your site's positioning in the results presented by search engines.

Tips to fix duplicate content on your site:

  • Recognize the duplicate content:

Recognizing the duplicate content on your site is the most important step. The easiest way, in my opinion, to accomplish this task is to select a short piece of text from one of your pages and search for it, remembering to limit the results to pages within your own site using the "site:" operator. If you get several results for the same content, you can start looking at those pages.
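
For example, a query along these lines (the domain and quoted sentence are placeholders for your own) will surface internal copies of a page:

    site:yoursite.com "a distinctive sentence copied from the page"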

  • Determine the preferred URL:

Before starting to solve any duplicate content problem, you should determine your preferred URL by answering a simple question: which URL do you prefer for this content?

  • Be consistent within your site:

Once you have chosen your preferred URL, make sure you use it everywhere within your site (including in the Sitemap).
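
As a minimal sketch, a Sitemap entry would list only the preferred URL, never its duplicates (the address below is simply the example used later in this post):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- list only the preferred URL, not its duplicate variations -->
      <url>
        <loc>http://www.esoftload.info/ebooks/paid/webmasters/google.html</loc>
      </url>
    </urlset>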

  • Use a 301 permanent redirect, where possible and necessary:

Redirecting duplicate URLs to the preferred URL helps users and search engines reach the right page even when they arrive through a duplicate URL, and the best way to do this is with a 301 (permanent) redirect.
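
As a minimal sketch, assuming your site runs on Apache and using the example URLs from later in this post, a 301 redirect can be declared in the .htaccess file:

    # .htaccess - permanently redirect a duplicate URL to the preferred one
    Redirect 301 /ebooks/webmasters/google /ebooks/paid/webmasters/google.html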

  • Apply the rel="canonical" attribute:

When you cannot use a 301 redirect, the rel="canonical" attribute will help search engines understand your site and its preferred URLs. This markup is also recognized by search engines other than Google, such as Ask.com, Yahoo! and Bing.
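
A minimal sketch of the markup, placed in the <head> of each duplicate page and pointing to the preferred URL (the address below is the example used later in this post):

    <!-- added to every duplicate version of the page -->
    <link rel="canonical" href="http://www.esoftload.info/ebooks/paid/webmasters/google.html" />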

  • Use the URL parameters tool in Google Webmaster Tools:

If your duplicate content comes from URL query parameters, the URL parameters tool in Google Webmaster Tools lets you tell Google which parameters are important and which are irrelevant for your URLs. For example, if you run an online eBook shop and one of your products is a leading eBook for webmasters, the product page might be reachable through several different URLs carrying session IDs and other parameters:

    www.esoftload.info/ebooks/paid/webmasters/google.html
    www.esoftload.info/ebooks/webmasters/google?det=top10
    www.esoftload.info/shop/index.php?product_id=4&highlight=google+top10&cat_id=4&sessionid=233=1231&affid

If search engines know that these pages all have the same content, they will index only one version in the search results. Through the Google tool, the irrelevant parameters can be ignored, thereby reducing the duplicate content on your site.

If you ask for the sessionid parameter to be ignored, Google will treat "www.esoftload.info/ebooks/paid/webmasters/google.html?sessionid=27529" the same as "www.esoftload.info/ebooks/paid/webmasters/google.html".

  • Be careful with robots.txt:

I do not recommend blocking access to the duplicate content on your site through the robots.txt file or similar workarounds. If you block crawler access instead of using the rel="canonical" attribute or 301 redirects, search engines will treat those pages as separate, ordinary URLs and will never learn that the different URLs lead to the same content.

It is better to let these pages be crawled and treated as duplicates using one of the methods recommended here. By allowing these URLs to be crawled, the robot can learn the rules for identifying your duplicate content as soon as it encounters a new URL, avoiding unnecessary crawling later.
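
For clarity, this is the kind of robots.txt rule the advice above argues against; the path is only illustrative:

    # Not recommended: blocking duplicate URLs hides them instead of consolidating them
    User-agent: *
    Disallow: /shop/index.php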

If duplicate content is making search engines crawl your site too heavily, you can adjust the crawl rate in Google Webmaster Tools so that the maximum number of pages is crawled on each visit without putting extra load on your server.

Attention: Google's new algorithm is very strict about duplicate content, so fix it soon to avoid being penalized by Google.

