SEO Safe Site Relaunch

What is important for a website relaunch? Above all, set up redirects whenever URLs change. True, but even if every redirect is correct, the new site can still be deindexed, which means nothing less than: no URLs in the index = no rankings and no traffic. This aspect is therefore just as important and should definitely be taken into account. To avoid this worst-case scenario, I have asked an SEO agency to explain three scenarios in which deindexing can occur and how to avoid the problem in each case.

Blocking the new site with robots.txt

An introduction to the structure and possible contents of a robots.txt file can be found in the detailed guide robots.txt – What is it and how do I use it?

A test environment (or sandbox) can be protected against indexing by robots.txt (alternatively or in addition to a password). Indexing is not yet desired in a test environment because, firstly, the content is often at least partially identical to the original site (a duplicate content SEO issue) and, secondly, no unfinished page should end up in the index.

A possible variant to protect the test environment is the following code in robots.txt:

  • User-agent: *
  • Disallow: /

This denies all search engine crawlers access to the entire domain.
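
If you want to double-check how crawlers interpret such a robots.txt, a minimal sketch with Python's built-in urllib.robotparser can help; the staging hostname below is only an illustrative placeholder.

    import urllib.robotparser

    # Hypothetical staging domain, used only for illustration
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url("https://staging.example.com/robots.txt")
    parser.read()  # fetches and parses the robots.txt

    # With "User-agent: *" and "Disallow: /" this should print False
    print(parser.can_fetch("*", "https://staging.example.com/any-page/"))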

Avoiding the SEO problem

The code that blocks the test environment must be adjusted when the live site is launched so that no URLs on the new website are excluded from crawling. In our example, the code would change to:

  • User-agent: *
  • Allow: /
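
The same urllib.robotparser check from above can be reused against the live robots.txt after launch; now the expectation flips, and every important URL should be reported as fetchable. The domain is again just a placeholder.

    # Re-run the check against the live robots.txt after the relaunch
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()

    # With "User-agent: *" and "Allow: /" this should print True
    print(parser.can_fetch("*", "https://www.example.com/any-page/"))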

Noindex tags on the new site

Another way to protect a test environment from indexing is to set all of its URLs to noindex. If the new site is launched and the noindex setting is not removed, the site will not be indexed and therefore cannot generate any rankings or traffic.

To be SEO safe, the noindex markup must be removed with the relaunch. To check whether this is the case everywhere, you can crawl the new site and look for exactly this point, for example with the custom extraction function in Screaming Frog. Alternatively, Screaming Frog also provides a Meta Robots column in which this information can be read.
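
As an alternative to a desktop crawler, a quick spot check can also be scripted. The sketch below assumes the requests library and an illustrative list of URLs; it looks for noindex both in the meta robots tag and in the X-Robots-Tag response header.

    import re
    import requests

    # Hypothetical URLs to spot-check after the relaunch
    urls = [
        "https://www.example.com/",
        "https://www.example.com/category/",
    ]

    # Simplified pattern: assumes the name attribute comes before content
    meta_noindex = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        re.IGNORECASE,
    )

    for url in urls:
        response = requests.get(url, timeout=10)
        in_header = "noindex" in response.headers.get("X-Robots-Tag", "").lower()
        in_html = bool(meta_noindex.search(response.text))
        if in_header or in_html:
            print(f"WARNING: noindex still set on {url}")
        else:
            print(f"OK: {url}")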

Deindexing caused by redirect loops

A relaunch is also a good opportunity to switch the site to https. Our article https-Switch in 7 steps explains what you have to consider.

In addition, you must ensure that all http URLs are redirected to the corresponding https URLs by a redirect rule, and that no other rule sends the https URLs back to the old http URLs.

Otherwise search engines follow the redirects in a circle, never reach a final target, and the URLs end up being deindexed. This is not only possible when switching to https, but also for individual URLs, for example when a source URL is redirected to a target URL and then back to the source URL. This issue is a big one, so I asked Who’s Talkin to explain what to do in such a scenario. These guys even do SEO for plastic surgeons, so they know what they are talking about.

Answer: After the relaunch, the redirects must be checked. Special attention should be paid to ensuring that there are no redirect chains and certainly no redirect loops, since these can lead to a fast deindexing of the site. A redirect should lead directly from the old URL to the new URL in a single hop.
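
One way to audit this is to follow each redirect one hop at a time and flag anything that needs more than one hop (a chain) or revisits a URL it has already seen (a loop). The sketch below assumes the requests library and a hypothetical list of old URLs.

    import requests
    from urllib.parse import urljoin

    # Hypothetical old URLs whose redirects should be audited
    old_urls = [
        "http://www.example.com/old-page/",
        "http://www.example.com/old-category/",
    ]

    MAX_HOPS = 10

    def audit_redirect(start_url):
        """Follow redirects hop by hop and report chains and loops."""
        seen = [start_url]
        current = start_url
        for _ in range(MAX_HOPS):
            # allow_redirects=False lets us inspect each hop individually
            response = requests.get(current, allow_redirects=False, timeout=10)
            if response.status_code not in (301, 302, 307, 308):
                hops = len(seen) - 1
                if hops == 0:
                    return f"NO REDIRECT: {start_url} answers with {response.status_code}"
                if hops == 1:
                    return f"OK: {start_url} -> {current}"
                return f"CHAIN: {start_url} needs {hops} hops to reach {current}"
            # The Location header may be relative, so resolve it first
            current = urljoin(current, response.headers["Location"])
            if current in seen:
                return f"LOOP: {start_url} eventually redirects back to {current}"
            seen.append(current)
        return f"Gave up after {MAX_HOPS} hops: possible loop starting at {start_url}"

    for url in old_urls:
        print(audit_redirect(url))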

Bottom line

The topic of indexing must always be taken into account during a relaunch. The best website is worthless if it does not make it into the search engines’ index and therefore generates no organic traffic. To avoid indexing problems, the following points must be checked:

  • robots.txt
  • index/noindex meta tags
  • possible redirect chains and loops