SEO best practices for React e-commerce websites

April 5, 2022 - 5 min reading time - by Dan Taylor

React has emerged as one of the leading JavaScript frameworks in use across a variety of different websites, and particularly on e-commerce sites.

Because of the way they are built, React websites have some specific considerations when it comes to search engine optimization (SEO), with a distinct set of SEO best practices for React Single Page Applications (SPAs).

In this guide we will look at how to ensure a React e-commerce site is correctly configured for SEO, and some of the pros and cons of using React on a search optimized e-commerce website.

What is React?

React is a JavaScript library which can be used to build user interfaces based on common UI components. It is maintained by Facebook’s parent company Meta, along with an open-source community of developers including companies and individuals.

Because it is open-source, React is also free to use. It can form the base framework of a single-page application (the SPAs mentioned above) or a mobile app.

Why is React good for e-commerce?

React’s focus on UI means it is a useful tool to create smooth, seamless e-commerce websites that put the user experience (UX) at the heart of the design.

Content is rendered on the client side and pages do not need to fully reload, making it faster and easier for customers to navigate a website, with fewer server round trips to affect page loading speeds.

React’s versatility means it can be used across desktop e-commerce sites and mobile apps alike, making it one of the easiest options for developers who want to create Progressive Web Apps.

Can search engines crawl React websites?

As with any search optimized website, it’s important to understand how the robots (and especially Googlebot) see the content on a React site.

In general, Google processes a website in two stages, identifying the content as it goes.

Crawl the source code

First, Googlebot will crawl the website and retrieve the source code, including the HTML, page headers and so on.

Render the DOM

Second, Googlebot renders the Document Object Model (DOM), including any JavaScript used on the page – you can inspect the rendered DOM using Chrome’s built-in Developer Tools and equivalent features in other browsers.

React is a client-side JavaScript framework, which means Googlebot may have difficulty identifying individual pages. Unlike a traditional website, a React SPA does not request a new page from the server when navigating from one view to another – making it difficult for Google to see the different pages.

You can configure React to use server-side rendering, so that Googlebot receives ready-rendered HTML rather than having to execute your JavaScript itself. Either way, it’s important to set up your React e-commerce website for SEO in line with best practice.

Common SEO problems with React

There are several common SEO problems with React e-commerce websites:

Content not discovered / slow to index

Googlebot allocates a ‘render budget’ to each website it crawls and will typically leave once that budget has been spent. This ensures smaller websites get their share of the crawler’s attention and prevents infinite crawling loops, for example on websites that generate dynamic URLs during navigation.

Because React websites function as SPAs, each page must be rendered before its content can be crawled. This can delay the discovery of content and consumes extra resources on the crawler’s side, which may lead to fewer pages crawled and indexed.

Server-side rendering and pre-rendering can take some of the resource burden off the robots, helping to improve the speed with which new pages appear in the index.

Search robots slow to notice updated content

This is an offshoot of the problem above – when you change a page, it may take some time before those changes filter through into the search results.

Again, this is because of how pages are rendered when using React, which can lead to updated pages going unnoticed for longer by Google’s finite crawler resources.

‘Deep’ pages crawled rarely (or never)

Pages buried deep in your site’s hierarchy are less likely to be reached by Googlebot, especially if it has already spent significant render budget higher up in your hierarchy.

Once again, pre-rendering can be a valuable tool to help the search crawlers penetrate deeper into your website’s folder structure, before they run out of assignable crawl budget.


How to optimize React e-commerce sites

With some best practices for React e-commerce SEO, you can give your site – and your individual pages – a better prospect of making it into the search index.

Unique Page URLs

The best practice counterpart to the dynamic URLs issue mentioned above is to give every page its own unique and static URL on your website. Content is then associated with a single permanent location – a canonical URL for the page – which search engines can crawl, index and rank with confidence in their results pages.

React Router is the standard way to achieve this on React websites: it gives every page on the site its own permanent URL and keeps the UI in sync with that URL.
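As a rough sketch, a static route map with React Router (v6 syntax) might look like the following. The component names and URL paths here are illustrative placeholders, not part of any real store:

```javascript
// A minimal sketch of static, crawlable routes with React Router v6.
// Home, CategoryPage and ProductPage are hypothetical page components.
import { BrowserRouter, Routes, Route } from "react-router-dom";

function App() {
  return (
    <BrowserRouter>
      <Routes>
        {/* Each page gets its own permanent, bookmarkable URL */}
        <Route path="/" element={<Home />} />
        <Route path="/category/:categorySlug" element={<CategoryPage />} />
        <Route path="/product/:productSlug" element={<ProductPage />} />
      </Routes>
    </BrowserRouter>
  );
}
```

With a structure like this, `/product/blue-shirt` always resolves to the same content, giving search engines a stable canonical URL to crawl and index.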

Isomorphic React

Isomorphic React is a way to enable server-side rendering, relieving the pressure on crawlers’ render budgets. The initial HTML is rendered on the server, so clients that cannot execute JavaScript – including search crawlers – still receive the full page content. If client-side rendering is available, the application ‘hydrates’ and behaves as a React SPA in the usual way.

By doing this, Isomorphic React overcomes the visibility problems for search crawlers, improving the discoverability of pages, without any detriment to the smooth and seamless React experience for human visitors with client-side JavaScript turned on.
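A bare-bones sketch of the server side of this setup, assuming an Express server and a hypothetical `App` component shared between server and client, might look like this:

```javascript
// A minimal server-side rendering sketch using Express and ReactDOMServer.
// App is a hypothetical component shared with the client ("isomorphic").
import express from "express";
import React from "react";
import { renderToString } from "react-dom/server";
import App from "./App";

const server = express();

server.get("*", (req, res) => {
  // Render the full HTML on the server, so crawlers (and clients with
  // JavaScript disabled) receive real content instead of an empty shell.
  const html = renderToString(<App url={req.url} />);
  res.send(`<!DOCTYPE html>
<html>
  <head><title>Store</title></head>
  <body>
    <div id="root">${html}</div>
    <script src="/client.js"></script>
  </body>
</html>`);
});

server.listen(3000);
```

On the client, `/client.js` would call `hydrateRoot` on the same `App` component, turning the static HTML back into a normal interactive SPA.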

Pre-rendering

Pre-rendering is another way to achieve a similar outcome to Isomorphic React. It works by generating a cached version of the rendered HTML, which crawlers can then access instead of the unrendered source code.

Human visitors receive the client-side React website, again giving them all the benefits of a smooth, lightning-fast SPA. There are many pre-rendering options to choose from, including Google’s Puppeteer (a headless Chrome library you can run yourself) and hosted services such as SEO4Ajax.
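The routing decision at the heart of a pre-rendering setup can be illustrated with a few lines of plain JavaScript. This is a simplified sketch, not how any particular service implements it, and the bot list is illustrative rather than exhaustive:

```javascript
// A simplified sketch of how a pre-rendering setup decides which version
// of a page to serve: bots get cached, fully rendered HTML; humans get
// the normal SPA shell. The bot patterns below are illustrative only.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /yandex/i, /duckduckbot/i];

function isSearchBot(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ""));
}

function selectResponse(userAgent, prerenderedCache, url) {
  if (isSearchBot(userAgent) && prerenderedCache.has(url)) {
    return { type: "prerendered", body: prerenderedCache.get(url) };
  }
  return { type: "spa", body: '<div id="root"></div>' };
}
```

A hosted service such as SEO4Ajax performs this user-agent check (and the headless rendering that fills the cache) for you, so your own server only needs to forward bot requests.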

Optimizing Metadata

Finally, make sure your content has optimized metadata – still an important element in on-page SEO, even after all these years.

As well as a unique, optimized URL, each page should have a unique title tag and any other relevant meta tags, helping pages to stand out and signalling to the search robots which topic or primary keyword the page should be associated with.

React Helmet is a good way to achieve this: it gives you direct control over each page’s metadata from within your React components. In this way, you can combine old-school SEO best practice and classic metadata keywording with all the benefits of a modern, mobile-friendly React e-commerce website.
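As a sketch of the idea, a product page component using React Helmet might set its metadata like this. The `product` prop, domain and text values are hypothetical:

```javascript
// A hedged sketch of per-page metadata with react-helmet.
// The product shape ({ name, summary, slug }) and domain are illustrative.
import React from "react";
import { Helmet } from "react-helmet";

function ProductPage({ product }) {
  return (
    <div>
      <Helmet>
        {/* Unique title, description and canonical URL for each product */}
        <title>{`${product.name} | Example Store`}</title>
        <meta name="description" content={product.summary} />
        <link
          rel="canonical"
          href={`https://example.com/product/${product.slug}`}
        />
      </Helmet>
      <h1>{product.name}</h1>
    </div>
  );
}
```

Because the tags live inside the component, every routed page automatically carries its own metadata – no separate template system required.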

Dan Taylor is the Head of Technical SEO at an international bespoke SEO agency. He’s an experienced, multi-discipline digital marketing professional, predominantly specializing in SEO (technical and international). Dan has worked with a wide variety of companies, from LSE-listed major British brands to pioneering unicorn tech start-ups in San Francisco. Dan also speaks at SEO and tech conferences internationally, and writes for a number of industry blogs.