Any major updates or changes to your website should always be made with SEO considerations firmly in mind. From robots.txt errors to poorly implemented page redirects, it’s all too easy to see your search rankings vanish overnight.
It’s a large and complex issue, with plenty of technical details to keep in mind throughout a website migration. But there are common mistakes to avoid, as well as some overarching principles that can help guide you in the right direction.
Preserve important URLs
Preserving important URLs matters in any migration, but it is especially critical for Single Page Applications (SPAs), which need special treatment to ensure that important URLs from the former website are preserved and remain visible to Googlebot, protecting your existing presence in the SERPs.
Don’t be fooled if your homepage is still appearing in its rightful ranking – this could be a sign that Google is successfully crawling the homepage of your SPA but is failing to find the content served dynamically client-side.
Enable crawler access
This is about more than just unblocking the search engine crawlers in your robots.txt file. You need them to be able to crawl your site and see your content – and for that, it’s likely you’ll need some form of server-side rendering.
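Before anything else, it is worth confirming that your key URLs are actually crawlable under the new site's robots.txt. The sketch below uses Python's standard-library `urllib.robotparser` against a hypothetical robots.txt (the rules and paths are illustrative, not from any real site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for a migrated site.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Verify that key URLs remain crawlable after the migration.
for path in ("/", "/products/widget", "/admin/login"):
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```

Running a check like this against a list of your most valuable URLs is a cheap safeguard before (and immediately after) the switchover.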
This not only makes each page visible to the crawlers, but it can also increase the number of pages and the levels of your website hierarchy that get indexed, by placing less demand on the robots’ crawl budget per page.
By keeping in mind what the robots can actually do, you can ensure that your content is visible to them within those capabilities. As a reward, more of your content is likely to be crawled, indexed and ranked.
Redirect old URLs
If you do not plan to preserve old URLs exactly as they are, but you have a strong search presence, consider implementing permanent redirects to point the robots to the most closely matching equivalent within your new sitemap.
This preserves domain authority and can protect your rankings, at least until your newly published content is fully crawled and indexed. At that point, you can decide whether to retire older redirects whose pages no longer bring in significant traffic.
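In practice, a migration redirect plan often starts as a simple mapping from old paths to their closest new equivalents. A minimal sketch, using hypothetical URL pairs rather than any real site structure:

```python
# Hypothetical old-to-new URL mapping built during migration planning.
REDIRECTS = {
    "/old-blog/seo-tips": "/blog/seo-tips",
    "/products.php?id=42": "/products/widget",
    "/about-us.html": "/about",
}

def resolve(path: str) -> tuple[int, str]:
    """Return (HTTP status, target path) for an incoming request."""
    if path in REDIRECTS:
        # 301 (permanent) redirects pass on link equity to the new URL.
        return 301, REDIRECTS[path]
    return 200, path  # no mapping needed; serve the page as-is

print(resolve("/old-blog/seo-tips"))  # → (301, '/blog/seo-tips')
```

The same mapping can then be exported into whatever your server or CDN actually uses for redirects (e.g. rewrite rules), keeping one authoritative source of truth.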
Use consistent internal links
For SEO, it’s better to keep content relatively static and load a different URL when substantially changing the page. The crawlers understand this approach better, can map your website better as a whole, and are more likely to visit and correctly interpret the new page.
Poor discipline on internal linking is a major pitfall of many migrations. It can leave some pages much harder for the robots to crawl and sends confusing signals about the relative importance of some pages, compared with others.
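One way to enforce that discipline is to audit internal links for inconsistent variants of the same URL. The sketch below, using only Python's standard-library `HTMLParser`, flags pages linked both with and without a trailing slash (the HTML snippet is a made-up example):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def inconsistent_pairs(links):
    """Flag URLs that appear both with and without a trailing slash."""
    seen = set(links)
    return sorted(u for u in seen if u.endswith("/") and u.rstrip("/") in seen)

# Hypothetical page source with two variants of the same internal link.
html = '<a href="/pricing">x</a><a href="/pricing/">y</a><a href="/blog/">z</a>'
collector = LinkCollector()
collector.feed(html)
print(inconsistent_pairs(collector.links))  # → ['/pricing/']
```

Run across a full crawl of the site, a report like this surfaces exactly the mixed signals the paragraph above describes.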
The importance of consistency
Examples include how you handle URLs with and without a trailing slash (whether they resolve or redirect should be consistent across all areas of your website), as well as SEO best practices like canonical tags (which, again, should be consistent about which version of the content is deemed canonical).
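The core of consistency is picking one canonical form and normalizing every URL to it. A minimal sketch using Python's standard-library `urllib.parse`; the chosen conventions (https, lowercase host, no trailing slash) are a hypothetical site policy, and the real point is applying one convention everywhere:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Normalize a URL to one canonical form: https scheme,
    lowercase host, no trailing slash, no fragment.
    (Hypothetical policy; pick one convention and stick to it.)"""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"  # keep a bare "/" for the homepage
    return urlunsplit(("https", parts.netloc.lower(), path, parts.query, ""))

print(canonicalize("http://Example.com/pricing/"))  # → https://example.com/pricing
```

The same function can feed both your canonical tags and your internal link generation, so the two never disagree.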
Will this ever change?
Dynamically rendered client-side content is no friend to SEO because of the relatively simple nature of the search robots. In the future, we may see a more complex and capable Googlebot that can work around some of these issues, but for now the onus is on webmasters to serve up content in the most digestible way possible.
By keeping in mind some of the pitfalls and best practices outlined above, you can keep your SEO ambitions at the forefront of your website migration and avoid the nightmare scenario of having a large, profitable website vanish from the SERPs overnight.