Relaunch SEO checklist – 6 things you should definitely consider
At best, a professionally implemented website relaunch brings significant advantages, such as a new design, a better conversion rate, better-prepared content, better usability, responsive adjustments, better loading times, etc. A poorly implemented relaunch, on the other hand, can stand in the way of a website's success.
The relaunch of a website is also a technical SEO challenge in the life cycle of a website that should not be underestimated. All of the search engine optimization efforts of recent years can be affected by a relaunch. Common problems, such as missing or declining page rankings and the associated poor visibility of the website, can have a serious impact on organic traffic and thus lead to a loss of potential new customers and, in turn, revenue.
Our 6 tips will help you avoid the biggest mistakes when relaunching a website from an SEO perspective:
1. Do not relaunch without checking indexing
If a website is newly built or completely rebuilt, the pages under development and their content should usually be excluded from inclusion in a search engine's index. This exclusion can be implemented using the “noindex” meta tag in the HTML code or in the HTTP response (noindex via the X-Robots-Tag header).
A brief explanation with examples:
- Noindex <meta> tag example in the header <head> (exclusion from all search engines using the “robots” meta tag, which can also be used to define further directives):
- <meta name="robots" content="noindex" />
- Noindex <meta> tag example in header <head> (Exclusion from the Google web crawler):
- <meta name="googlebot" content="noindex" />
- IMPORTANT: Some web crawlers interpret the “noindex” directive differently, so it is possible that the pages will still be listed in the results of other search engines.
- HTTP response header using the X-Robots-Tag
- Header set X-Robots-Tag "noindex"
- IMPORTANT: The main difference between the X-Robots-Tag and the robots meta tag is that the X-Robots-Tag can implement much more comprehensive measures with less effort. Robots meta tags are more suitable for quick, small actions on a page-by-page basis; the X-Robots-Tag, on the other hand, is suited to large and extensive measures. X-Robots-Tag directives are implemented server-side in the httpd.conf or the .htaccess file.
When the new site or new components/content go live, it should therefore always be checked whether all pages can be indexed and thus found by a search engine, i.e. whether the noindex exclusions from the development phase have been removed.
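For illustration, here is a minimal .htaccess sketch that keeps an entire test environment out of the index via the X-Robots-Tag header. It assumes an Apache server with mod_headers enabled and is an example configuration, not a finished setup:

```apache
# Example only: .htaccess of a staging environment
# Sends an X-Robots-Tag header with every response so that the whole test site
# stays out of the search index while it is under development.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

This block (or the corresponding noindex meta tags) must of course be removed when the new site goes live, otherwise the live pages remain excluded from the index.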
2. Adopt meta snippets after relaunch and adjust relevant content
Meta snippets or metadata, such as the page title (meta title) and the page description (meta description), are important ranking factors. Rankings can be severely affected if metadata is not carried over after a relaunch and optimized for the new page content. Even meta snippets from the old version of a revised page that are not optimally chosen are often better than an “empty” snippet.
Even with correct metadata, rankings can be affected after a relaunch if, for example, the content no longer has the same keyword relevance. Content that already ranked well before the relaunch should not be deleted afterwards. To identify the relevant content in advance, an inventory of all content, a so-called “content audit”, should be created. The relevant content recorded in this process should then definitely be carried over into the relaunch.
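As a reminder of what should be carried over, here is a minimal HTML sketch of a maintained meta snippet in the <head>; the title and description values are illustrative placeholders, not recommendations:

```html
<head>
  <!-- Page title and meta description carried over from the old page and
       adjusted to the new content; the values below are placeholders -->
  <title>Relaunch SEO Checklist – 6 Tips for a Successful Website Relaunch</title>
  <meta name="description" content="What to check before and after a website relaunch so that rankings and organic traffic are preserved." />
</head>
```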
3. Check redirects
When a new website goes online, or an existing website has been rebuilt, renewed or extended, redirects from old pages to new pages are often forgotten. This happens particularly often when switching from http to https, and 301 redirects are frequently overlooked as well. The SEO website relaunch then becomes a ranking disaster, since duplicate content is created and the old and new pages compete with each other. And if the old site is deleted without a redirect to the new site, valuable backlinks pointing to the old pages are lost.
Options for prevention and follow-up:
- Incorrect or misplaced canonicals (tags and URLs)
- Canonicals only help if they are set correctly and the old page references the new one. Likewise, no canonicals pointing to the test environment should remain after going live; they must always be compared against and replaced with the final URLs.
- 301 status code redirects
- These tell search engines that the old content is no longer available at its previous URL but that new content exists as an alternative. 404 status codes can also be prevented with a 301 redirect.
- Rule out incorrectly set up redirects
- There are scenarios in which a 302 (temporary) redirect is accidentally set up instead of a 301 redirect. This should be avoided, otherwise search engines assume that the old page will eventually be available again and has only been deactivated briefly or is under maintenance.
- Redirect chains should also be avoided: each redirect in a chain is processed by the search engine crawler as a separate request until the final target page is reached. The consequences are slower loading times and possibly a negative impact on the crawl budget planned for the site.
- The redirect target is inappropriate.
- Avoid 301 redirects that point only to the start page: if each redirect goes directly to the corresponding new subpage, that subpage benefits from it and visibility in the search engines does not decrease. Of course, the redirects should also be thematically appropriate (see the sketch after this list).
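A minimal sketch of how such redirects could look in an Apache .htaccess file (mod_alias); the paths are purely illustrative, and each old URL points directly to its thematically matching new subpage, without chains and without falling back to the start page:

```apache
# Example only: permanent (301) redirects from old to new URLs
Redirect 301 /old-services.html      /services/
Redirect 301 /blog/old-article.html  /blog/new-article/
# Avoid: redirecting everything to the start page, e.g.
# Redirect 301 /old-services.html /
```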
4. Set internal links correctly and delete links from the test environment
A good structure and well-used internal linking to subpages are also essential for being rated better by search engines. This link relevance of the subpages is often lost after a website relaunch, and individual internal links are sometimes incorrect or no longer set at all. It is therefore important to check all internal links before a relaunch.
Likewise, links (internal and external) from the test environment can lead to problems after going live if they were only set for testing purposes and therefore miss their actual target. Excluding the test environment from crawling in advance is a great advantage here, especially if it runs on a subdomain or a local instance. All URLs and links must therefore also be checked and corrected here, and crawling must be re-enabled afterwards.
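The following Python sketch illustrates the idea of such a check: it scans the links on a page, flags links that still point to a hypothetical test environment, and reports broken internal links. The domains are placeholders, the script assumes the requests and beautifulsoup4 packages, and it is a starting point rather than a complete crawler:

```python
"""Minimal sketch: flag leftover test-environment links and broken internal links."""
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

LIVE_HOST = "www.example.com"                          # placeholder live domain
STAGING_HOSTS = {"staging.example.com", "localhost"}   # placeholder test environments

def check_links(page_url: str) -> None:
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])          # resolve relative links
        host = urlparse(target).hostname or ""
        if host in STAGING_HOSTS:
            print(f"LEFTOVER TEST LINK: {target}")
        elif host == LIVE_HOST:
            # HEAD request is enough here; some servers may not support it
            status = requests.head(target, allow_redirects=True, timeout=10).status_code
            if status >= 400:
                print(f"BROKEN INTERNAL LINK ({status}): {target}")

if __name__ == "__main__":
    check_links("https://www.example.com/")            # placeholder start page
```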
5. Exclude pages with no content after website relaunch
When a website is relaunched, new content is often integrated into existing pages, or completely new pages with new content are created. Since creating content is usually a time-consuming undertaking, placeholder pages and texts are often used, e.g. with “Lorem Ipsum” content. At go-live, these pages are sometimes unintentionally taken over with the placeholder texts and rated as “thin content” by the search engines. Since this can cause considerable damage, excluding these pages from indexing must not be forgotten.
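A simple way to catch such pages before go-live is to scan them for placeholder markers and very short texts. The following Python sketch shows the idea; the URLs, markers and word-count threshold are assumptions chosen purely for illustration:

```python
"""Minimal sketch: flag pages that still contain placeholder text or thin content."""
import requests
from bs4 import BeautifulSoup

PLACEHOLDER_MARKERS = ("lorem ipsum", "coming soon")   # illustrative markers
MIN_WORDS = 150                                        # arbitrary threshold for this sketch

def audit(urls):
    for url in urls:
        html = requests.get(url, timeout=10).text
        text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
        words = len(text.split())
        if any(marker in text.lower() for marker in PLACEHOLDER_MARKERS):
            print(f"PLACEHOLDER TEXT: {url}")
        elif words < MIN_WORDS:
            print(f"THIN CONTENT ({words} words): {url}")

audit(["https://www.example.com/new-page/"])           # placeholder URL
```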
6. SSL Certificates: Solving URL Problems
In general, all changes that affect SSL encryption should be clarified in advance. The following should be checked and, if necessary, edited before the new website is relaunched:
- Certificate active
- Certificate error-free
- If the domain is moved to a different web space in the course of the relaunch: certificate ordered and active
- Links checked for correct https references (no remaining http links)
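The first two points can be checked with a few lines of Python using only the standard library. The sketch below connects to a hypothetical domain, verifies that the certificate chain and host name are accepted, and prints the expiry date:

```python
"""Minimal sketch: check that the SSL certificate of a placeholder domain is
active and error-free, and print when it expires. Standard library only."""
import socket
import ssl
from datetime import datetime, timezone

HOST = "www.example.com"  # placeholder domain

context = ssl.create_default_context()  # verifies certificate chain and host name
with socket.create_connection((HOST, 443), timeout=10) as sock:
    # Raises ssl.SSLCertVerificationError if the certificate is invalid or does not match HOST
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()

expires = datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
days_left = (expires - datetime.now(timezone.utc)).days
print(f"Certificate for {HOST} is valid and expires in {days_left} days ({expires:%Y-%m-%d}).")
```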
Conclusion
An SEO relaunch can become a technical challenge, and not only for products hosted on the German market. It takes a lot of know-how to rule out mistakes that Google and other search engines do not forgive. In addition to the technical effort of the move itself, technical SEO and organic SEO are often the biggest and most complex challenges. A website relaunch should therefore always be planned, structured, revised and implemented professionally, so that costs are saved and the restart is safe, optimized and successful. Alongside a number of other important steps, the tips in this article provide an essential excerpt of what needs to be considered for SEO during a relaunch.
Our SEO professionals from online marketing, technology and design are also happy to support you with a website relaunch.