Deepcrawl is now Lumar.

Domain Migration

Check out our in-depth website migration checklist for a practical guide on managing a site migration project.

Domain migration is the process of moving a website from one domain to another and involves migrating all content and resources. There are several things that need to be considered to ensure the migration is successful and doesn’t affect a website’s performance in search. In our SEO Office Hours Notes below we cover these factors, commonly asked site migration questions, and best practice advice from Google.


Merging two websites together can be complex, but isn’t inherently ‘good’ or ‘bad’ for SEO

Merging two websites together essentially creates a new entity. Google must explore and review how this fits into the wider search landscape as it would with any other new site. 

Site migrations that involve merging content from multiple domains can be complicated, but it’s sometimes the only option. It’s therefore not a bad tactic, but extra care should be taken to ensure every step is well documented in case errors do occur.

28 Oct 2022

Use log file analysis to understand which older 404 pages may benefit from redirects

When doing a website migration, it’s important to make sure that important external links are redirected so that users don’t land on a 404 and the link equity isn’t lost; over time, this can also be reflected in the search results. John mentioned that after a couple of years a difference may no longer be noticeable, but if really strong external links point to a 404 or broken URL, it’s still worthwhile to redirect those pages. He clarified that you can analyze this further using log files to see which older 404 pages search engines are still regularly trying to access; that can be a sign they should be redirected to something more useful.
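The log file analysis John describes can be sketched in a few lines. This is a minimal example, assuming access logs in the common "combined" format; the regex and field positions are illustrative assumptions, not a reference implementation.

```python
import re
from collections import Counter

# Matches the common "combined" access log format (an assumption about
# your server's configuration; adjust the pattern to your log layout).
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_404s(log_lines):
    """Count 404 responses served to Googlebot, grouped by URL path."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "404" and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1
    return hits
```

Paths that Googlebot still requests frequently, long after the migration, are the strongest candidates for redirects.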

7 Apr 2022

Updating backlinks to a migrated domain helps with canonicalization

An attendee was talking about a website migration from domain A to domain B. They were setting up redirects, but asked whether the page authority and rankings would be negatively affected if there were many existing backlinks that point to domain A.

John replied that setting up redirects and using the Change of Address tool in Search Console will help Google understand the changes made during a site migration. However, on a per-page basis Google also looks at canonicalization. For canonicalization on migrated domains, John said that redirects, internal links, and canonical tags all play a role, but so do external links. If Google sees a lot of external links pointing to the old URL, it might index the old URL instead of the new one, reasoning that the change could be temporary given those linking signals. During site migrations, Google recommends finding the larger websites linking to your previous domain and asking them to update their backlinks, so everything aligns with the new domain.
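One of the canonicalization signals mentioned above, the canonical tag, is easy to audit after a migration. The sketch below, using only the standard library, extracts a page's `rel="canonical"` link and checks that it points at the new domain; the domain names in the usage example are placeholders.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def canonical_points_to(html, new_host):
    """Return True if the page's canonical URL is on the new domain."""
    finder = CanonicalFinder()
    finder.feed(html)
    return (finder.canonical is not None
            and urlparse(finder.canonical).hostname == new_host)
```

Running this across migrated pages can surface canonical tags that were left pointing at the old domain, which would conflict with the redirect and backlink signals.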

21 Feb 2022

There’s generally no SEO benefit to repurposing an old or expired domain

When asked about using old, parked domains for new sites, John clarifies that users will still need to put the work in to get the site re-established. If the domain has been out of action for some time and comes back into focus with different content, there generally won’t be any SEO benefit to gain. In the same vein, it typically doesn’t make sense to buy expired domains if you’re only doing so in the hopes of a visibility boost. The amount of work needed to establish the site would be similar to using an entirely new domain.

6 Dec 2021

It can take years for crawling on migrated domains to be stopped completely

John confirmed that it takes a very long time (even years) for Google’s systems to completely stop crawling a domain, even after it has been redirected.

17 Nov 2021

Domain Redirects Should Remain in Place Permanently

Google recommends keeping domain redirects in place for a significant amount of time, at least a year, and preferably for as long as possible while you’re still seeing users or bots accessing the old domain.

6 Mar 2020

Use GSC to Identify If There Are Any Errors With a Site’s URL Structure After a Migration

After completing a site migration, John recommends using GSC to compare the queries and positions the site ranked for before and after the change. This will show whether Google has misunderstood the new URL structure, and whether (and where) the migration has impacted traffic to the site.
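The before/after comparison above can be done with a short script once the query data has been exported from the GSC Performance report. This is a hedged sketch assuming the data is shaped as `{query: average position}`; the five-position threshold is an arbitrary example, not a Google recommendation.

```python
def compare_rankings(before, after, drop_threshold=5.0):
    """Flag queries that lost ground or disappeared after the migration.

    `before` and `after` map each query to its average position
    (lower is better), as exported from the GSC Performance report.
    """
    issues = {}
    for query, old_pos in before.items():
        new_pos = after.get(query)
        if new_pos is None:
            issues[query] = "no longer ranking"
        elif new_pos - old_pos >= drop_threshold:
            issues[query] = f"dropped from {old_pos:.1f} to {new_pos:.1f}"
    return issues
```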

31 Jan 2020

Keep Old Domain & 301 Redirects for as Long as Possible After Domain Migration

John recommends maintaining 301 redirects from an old domain to the new one for at least a year after migrating. However, users may still access the old domain years after the migration, so consider keeping the redirects for as long as possible. Also try to keep ownership of the old domain so spammers don’t misuse it.
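A periodic audit can confirm those redirects are still in place years later. The sketch below classifies the response an old URL returns; the status code and `Location` header would come from a HEAD request against the old domain (not shown), and the host names are placeholders.

```python
from urllib.parse import urlparse

def check_redirect(status, location, new_host):
    """Classify a response from the old domain during a migration audit."""
    if status == 301 and location and urlparse(location).hostname == new_host:
        return "ok: permanent redirect to new domain"
    if status in (301, 302, 307, 308):
        # Redirecting, but not a permanent redirect straight to the new host.
        return f"warn: {status} redirect to {location!r}"
    if status == 404:
        return "error: redirect dropped, returning 404"
    return f"error: unexpected status {status}"
```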

7 Jan 2020

It is Normal For Old URLs To Still be Visible in Search Results After Performing a Domain Migration

When performing a domain migration, it’s perfectly normal for pages from the old domain to still appear in the search results. However, the cached page is likely already shown under the new domain’s URL, as Google identifies the old URL as an alternate version of the new one. John does not recommend using the URL removal tool for this, as it won’t provide a long-term fix; it only hides the URLs in search results.

1 Nov 2019

Blocking Googlebot’s IP is The Best Way to Prevent Google From Crawling Your Site While Allowing Other Tools to Access It

If you want to block Googlebot from crawling a staging site, but want to allow other crawling tools access, John recommends whitelisting the IPs of the users and tools you need to view the site but disallowing Googlebot. This is because Google may crawl pages they find on a site, even if they have a noindex tag, or index pages without crawling them, even if they are blocked in robots.txt.
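The allowlisting approach John describes can be expressed with the standard `ipaddress` module. This is a minimal sketch of the access check; the networks below are documentation-example ranges standing in for your own users and tools, and in practice the check would live in your server or firewall configuration rather than application code.

```python
import ipaddress

# Placeholder allowlist: only these requesters may reach the staging site.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # e.g. office network
    ipaddress.ip_network("198.51.100.7/32"),  # e.g. a crawling tool's address
]

def is_allowed(client_ip):
    """Return True only for requesters inside the allowlisted networks.

    Everything else, including Googlebot, is denied, which prevents
    crawling regardless of noindex tags or robots.txt rules.
    """
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)
```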

1 Oct 2019
