SEO considerations when migrating to a JavaScript framework

April 19, 2022 - 5 min reading time - by Dan Taylor

Any major updates or changes to your website should always be made with SEO considerations firmly in mind. From robots.txt errors to poorly implemented page redirects, it’s all too easy to see your search rankings vanish – quite literally overnight.

This is equally true when migrating your website to a JavaScript framework. With some careful planning, you can make sure your site remains visible to the search engine robots, avoid the common risks and pitfalls that lead to traffic loss, and prepare your site for continued future growth.

It’s a large and complex issue with a lot of technical details to keep in mind throughout your website migration. But, there are some common mistakes to avoid as well as some overarching principles that can help to guide you in the right direction.

Here are some of the top SEO concerns when migrating to a JavaScript framework.

Preserve important URLs

Googlebot and the other search engine crawlers associate website content with URLs – it’s how they link to you from their search results – but JavaScript frameworks can break the bond between static URLs and content by updating pages dynamically.

This is especially true of Single Page Applications (SPAs) which need special treatment to ensure that any important URLs from the former website are preserved and remain visible to Googlebot, to protect your existing presence in the SERPs.

Don’t be fooled if your homepage is still appearing in its rightful ranking – this could be a sign that Google is successfully crawling the homepage of your SPA but is failing to find the content served dynamically client-side.
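As a rough sketch of what URL preservation can look like in practice, the snippet below maps preserved paths to views using history-based routing, so each important URL from the old site still resolves to a real, crawlable address. The choice of React Router, and the routes and components shown, are illustrative assumptions rather than anything prescribed in this article:

```tsx
// Illustrative sketch: keeping legacy URLs alive in an SPA by routing on real
// paths (history mode) rather than hash fragments or in-page state.
// Framework, routes and components are assumptions, not a prescription.
import React from "react";
import { createRoot } from "react-dom/client";
import { BrowserRouter, Routes, Route } from "react-router-dom";

const Home = () => <h1>Home</h1>;
const Category = () => <h1>Category</h1>;
const Product = () => <h1>Product</h1>;

function App() {
  return (
    <BrowserRouter>
      <Routes>
        {/* Each route mirrors a URL that existed on the old site,
            so Googlebot can keep requesting the same addresses. */}
        <Route path="/" element={<Home />} />
        <Route path="/category/:slug" element={<Category />} />
        <Route path="/product/:slug" element={<Product />} />
      </Routes>
    </BrowserRouter>
  );
}

createRoot(document.getElementById("root")!).render(<App />);
```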

Enable crawler access

This is about more than just unblocking the search engine crawlers in your robots.txt file. You need them to be able to crawl your site and see your content – and for that, it’s likely you’ll need some form of server-side rendering.

By implementing server-side rendering or pre-rendering, you give the search robots a version of your content as it appears after any JavaScript has been executed, removing the resource burden of asking the robots to render the page content themselves.

This not only makes each page visible to the crawlers, but it can also increase the number of pages and the levels of your website hierarchy that get indexed, by placing less demand on the robots’ crawl budget per page.
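Here is a minimal sketch of what server-side rendering might look like, using Express and React’s renderToString. The component, markup and port are placeholders, and a production setup would also handle hydration, caching, routing and error states:

```tsx
// Minimal server-side rendering sketch: the crawler receives fully rendered
// HTML instead of an empty shell plus a JavaScript bundle to execute itself.
// The App component and port are placeholders for illustration only.
import express from "express";
import React from "react";
import { renderToString } from "react-dom/server";

const App = ({ path }: { path: string }) => (
  <h1>Rendered on the server for {path}</h1>
);

const app = express();

app.get("*", (req, res) => {
  const html = renderToString(<App path={req.path} />);
  res.send(
    `<!doctype html><html><head><title>Example</title></head>` +
      `<body><div id="root">${html}</div></body></html>`
  );
});

app.listen(3000);
```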

Improve crawlability

You can give the robots even more of a helping hand by presenting information in an easy-to-digest way. Think about this when implementing JavaScript capabilities like onclick events and infinitely scrolling pagination.

By keeping in mind what the robots can actually do, you can ensure that your content is visible to them within those capabilities. As a reward, more of your content is likely to be crawled, indexed and ranked.
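For example, infinite scroll can be paired with plain paginated links, so that deeper pages stay reachable for robots that never click buttons or fire scroll events. The sketch below assumes React and a hypothetical /api/products endpoint; neither comes from the article:

```tsx
// Sketch of infinite scroll backed by real paginated URLs: humans get more
// items appended as they scroll or click, while crawlers can follow a plain
// "next page" link. The endpoint and props are illustrative assumptions.
import React, { useState } from "react";

type Props = { items: string[]; page: number; totalPages: number };

export function ProductList({ items, page, totalPages }: Props) {
  const [visible, setVisible] = useState(items);
  const [lastPage, setLastPage] = useState(page);

  async function loadMore() {
    // Hypothetical JSON endpoint used only for the in-page experience.
    const res = await fetch(`/api/products?page=${lastPage + 1}`);
    const next: string[] = await res.json();
    setVisible((current) => [...current, ...next]);
    setLastPage((p) => p + 1);
  }

  return (
    <div>
      <ul>
        {visible.map((name) => (
          <li key={name}>{name}</li>
        ))}
      </ul>
      <button onClick={loadMore}>Load more</button>
      {/* A plain link to the next paginated URL keeps deeper pages
          discoverable for crawlers without JavaScript execution. */}
      {page < totalPages && <a href={`/products/page/${page + 1}`}>Next page</a>}
    </div>
  );
}
```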

An added bonus is that by creating a crawler-friendly version of your website, you may also improve accessibility for some human visitors, who may not have JavaScript functioning on their device when they visit your site.

Redirect old URLs

If you do not plan to preserve old URLs exactly as they are, but you have a strong search presence, consider implementing permanent redirects to point the robots to the most closely matching equivalent within your new sitemap.

This preserves domain authority and can protect your rankings, at least until any newly published content is fully crawled and indexed. At that point, you may decide to retire the older redirects, provided those old URLs are no longer bringing in significant traffic.
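As a simple illustration, a redirect map wired up with permanent (301) redirects might look like the middleware below. The Express setup and the old and new paths are hypothetical examples; the same redirects could equally be configured at the CDN or web-server level:

```typescript
// Minimal sketch of permanent (301) redirects from old URLs to their closest
// equivalents on the new site. The paths in the map are hypothetical.
import express from "express";

const redirects: Record<string, string> = {
  "/old-category/widgets.html": "/category/widgets",
  "/products/blue-widget.php": "/product/blue-widget",
};

const app = express();

app.use((req, res, next) => {
  const target = redirects[req.path];
  if (target) {
    // 301 tells crawlers the move is permanent, passing signals to the new URL.
    res.redirect(301, target);
    return;
  }
  next();
});

app.listen(3000);
```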

Use consistent internal links

Again, this is about dynamic loading of content. JavaScript frameworks can unleash some cool capabilities, such as the ability to pull in data from an external resource and use it to update the current page, rather than navigating to a different URL.

For SEO, it’s better to keep content relatively static and load a different URL when substantially changing the page. The crawlers understand this approach better, can map your website better as a whole, and are more likely to visit and correctly interpret the new page.
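To sketch that principle: when a user action substantially changes the content, such as switching category, push a new, crawlable URL rather than silently swapping data into the current view. The component and route names below are hypothetical, and React Router’s built-in Link component achieves much the same thing:

```tsx
// When the content changes substantially (e.g. a different product category),
// navigate to a distinct URL instead of mutating the current page in place.
// Component and route names are hypothetical.
import React from "react";
import { useNavigate } from "react-router-dom";

export function CategoryPicker({ categories }: { categories: string[] }) {
  const navigate = useNavigate();

  return (
    <ul>
      {categories.map((slug) => (
        <li key={slug}>
          {/* A real href per category lets crawlers map the site as a whole;
              the click handler layers fast client-side navigation on top. */}
          <a
            href={`/category/${slug}`}
            onClick={(event) => {
              event.preventDefault();
              navigate(`/category/${slug}`);
            }}
          >
            {slug}
          </a>
        </li>
      ))}
    </ul>
  );
}
```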

Poor discipline on internal linking is a major pitfall of many migrations. It can leave some pages much harder for the robots to crawl and sends confusing signals about the relative importance of some pages, compared with others.

The importance of consistency

Beyond internal links, try to be consistent about all the signals you send to the search engines. This is especially true if your website loads only some of its content via the JavaScript framework, as you should ensure the JavaScript pages load and function similarly to any HTML or PHP pages on your site.

Examples of this can include resolving URLs with and without a trailing slash (whether this resolves or redirects should be consistent across all areas of your website), as well as SEO best practices like implementing canonical tags (which again should be consistent in terms of which content is deemed canonical).
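As one possible sketch of that consistency, the middleware below 301-redirects any trailing-slash URL to its non-trailing-slash form and declares that same form as the canonical URL on every rendered page. The choice of “no trailing slash”, the domain and the Express setup are all assumptions for illustration:

```typescript
// Sketch of consistent URL handling: trailing-slash URLs are permanently
// redirected to one canonical form, and each rendered page declares that
// same form in its canonical tag. Domain and slash policy are assumptions.
import express from "express";

const app = express();

app.use((req, res, next) => {
  if (req.path.length > 1 && req.path.endsWith("/")) {
    // e.g. /category/widgets/ -> /category/widgets, applied site-wide
    // (query strings omitted here for brevity).
    res.redirect(301, req.path.slice(0, -1));
    return;
  }
  next();
});

app.get("*", (req, res) => {
  const canonical = `https://www.example.com${req.path}`;
  res.send(`<!doctype html>
<html>
  <head>
    <link rel="canonical" href="${canonical}" />
    <title>Example page</title>
  </head>
  <body>Content for ${req.path}</body>
</html>`);
});

app.listen(3000);
```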

Prevention is better than cure, especially where search rankings are concerned. So, try to understand not only how content displays on your website, but how it comes to be displayed, whether that’s by client-side or server-side rendering – and whether your migration to JavaScript will affect the visibility of content on your website, the next time it is crawled.

Will this ever change?

Dynamically rendered, client-side content is no friend to SEO because of the relatively simple nature of the search robots. In the future, we may see a more complex and capable Googlebot that can work around some of these issues, but for now, the onus is on webmasters to serve up content in the most digestible way possible.

This is true throughout the planning, implementation and management of any website that relies on a search presence. But it’s of paramount importance during major updates, upgrades and migrations, including migration to a JavaScript framework.

By keeping in mind some of the pitfalls and best practices outlined above, you can keep your SEO ambitions at the forefront of your website migration and avoid the nightmare scenario of having a large, profitable website vanish from the SERPs overnight.

Dan Taylor is the Head of Technical SEO at SALT.agency, an international bespoke SEO agency. He’s an experienced, multi-discipline digital marketing professional, predominantly specializing in SEO (technical and international). Dan has worked with a wide variety of companies, from major LSE-listed British brands to pioneering unicorn tech start-ups in San Francisco. Dan also speaks at SEO and tech conferences internationally, and writes for a number of industry blogs.