Deepcrawl is now Lumar.

Duplicate Content

What is duplicate content? Duplicate content occurs when there is the exact same (or very similar) content appearing in multiple places on a website.

There are several SEO issues that can occur when a website has duplicate content, including crawl budget issues, search engine indexing issues, index bloat, keyword cannibalization, and canonical tag issues.
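One common remedy for several of these issues is the canonical tag: a duplicate page can declare the preferred URL in its `<head>` so that search engines consolidate ranking signals onto one version. A minimal sketch (the URLs are illustrative, not from the source):

```html
<!-- On a duplicate page, e.g. https://example.com/shoes?color=red -->
<!-- Tell search engines which version of the content is preferred: -->
<link rel="canonical" href="https://example.com/shoes" />
```

Note that `rel=canonical` is a hint rather than a directive, so search engines may still choose a different canonical version.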

Our SEO Office Hours recaps below compile best practices Google has recommended for websites dealing with duplicate content issues.

(See Lumar’s full guide to duplicate content for even more actionable tips on how SEOs can address duplicate content issues.)

For even more on website content best practices for SEO, read our Guide to Optimizing Website Content for Search — or explore our Website Intelligence Academy resources on SEO & Content.

Increased Crawl Rates Can Be Caused by Authority, Server Performance or Duplicate Content

A temporarily high crawl rate can have several causes: an increase in your site's authority, Google deciding that your server can handle an increased load, or Google discovering a large amount of duplicate content generated by things like URL parameters.

8 Jul 2016

Copied Content Can Outrank the Original Source

If someone copies content from your site, the copy might rank above the original, for example when the original page lacks a good title tag and the copied version provides more context.

8 Jul 2016

No Penalty for Links to Duplicate Content

Linking to sites with duplicate content won’t ‘usually’ result in a penalty.

1 Jul 2016

Google May Rank Alternative Sites Hosting Duplicate Content

If your content is duplicated on other websites, those copies can rank as well as or better than your own pages if Google decides they are the better match.

20 May 2016

Robots.txt Overrides Parameter Settings

URL Parameter settings in Search Console are a hint for Google, which it validates periodically. A robots.txt disallow overrides the parameter settings, so it is better to use the parameter tool to consolidate duplicate pages than to disallow them.
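To illustrate the interaction described above: if parameterized URLs are disallowed in robots.txt, Google cannot crawl them at all, so it never sees that they duplicate other pages and cannot consolidate their signals. A sketch with an illustrative parameter name:

```
# robots.txt — blocking parameterized URLs outright prevents
# Google from crawling them, so duplicate signals are never
# consolidated onto the canonical page:
User-agent: Googlebot
Disallow: /*?sessionid=

# The alternative recommended here: leave these URLs crawlable
# and declare the parameter in Search Console's URL Parameters
# tool instead, letting Google fold the duplicates together.
```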

17 May 2016

AJAX Pages Can Rank as Well as Static HTML

AJAX is not 100% equivalent to normal static HTML pages, but it should be possible to make AJAX pages rank just as well. Avoid using fragments in URLs, and check how Google sees the pages with Fetch and Render.

6 May 2016

Google Will Rewrite Keyword Stuffed and Duplicate Titles

Google will rewrite title tags that are keyword stuffed or duplicated across a lot of pages, and will also rewrite them to suit the user's device.

6 May 2016

Duplicate Content Snippets are Credited to the Most Relevant Version

If you have pages with duplicated snippets of content, such as product descriptions, Google will show the page which it thinks is the most relevant version.

8 Mar 2016

Google Establishes Original Content Sources

If multiple sites are using the same affiliate/product content, Google will try to establish the original source and allow a few other versions to rank. They will be treated as separate sites which can rank independently for different queries. The suggestion is that this is fine provided you include some unique content; the same advice was later given for pages with embedded videos.

23 Feb 2016

Pages with Duplicate Content Can Rank

Pages with content duplicated on other sites can still rank if they have something unique.

23 Feb 2016
