[Webinar Digest] SEO in Orbit: JS in 2019: modern SEO for modern users

September 24, 2019 - 11 min reading time - by Rebecca Berbel

The webinar SEO in Orbit: JS in 2019: modern SEO for modern users is part of the SEO in Orbit series, and aired on May 2nd, 2019. In this episode, we discussed how JavaScript, its use on websites, and its support by search engines have undergone major improvements since we first realized that JS could be problematic for Google. With Mike King, we explored the new possibilities for reconciling interactive, professional websites and search engine crawl technologies in 2019.

SEO in Orbit is the first webinar series sending SEO into space. Throughout the series, we discussed the present and the future of technical SEO with some of the finest SEO specialists and sent their top tips into space on June 27th, 2019.


Presenting Michael King

An artist and a technologist, all rolled into one, Michael King recently founded the boutique digital marketing agency iPullRank. Mike consults with companies all over the world, including brands ranging from SAP, American Express, HSBC, SanDisk, General Mills, and FTD to a laundry list of promising startups and small businesses.

Mike has held previous roles as Marketing Director, Developer, and tactical SEO at multi-national agencies such as Publicis Modem and Razorfish. Effortlessly leaning on his background as an independent hip-hop musician, Mike King is a dynamic speaker who is called upon to contribute to conferences and blogs all over the world. Mike recently purchased UndergroundHipHop.com, a 20-year-old indie rap mainstay, and is working on combining his loves of music, marketing, and technology to revitalize the brand.

This episode was hosted by François Goube, co-founder and CEO at Oncrawl.

JavaScript: the big picture

Most interactive functionality on websites today is powered by JavaScript.

JavaScript isn’t the only option for interactivity: CSS has a lot of native interactivity built in.

Back in the day, the thinking was that for SEO, you needed a completely different static version of the page in order to work with search engines, but that’s no longer realistic. Because much of the web has adopted this type of interactivity, search engines have needed to keep up.

Consequently, SEOs have also needed to modify their approach to optimizing pages driven by JavaScript.

How user agents see pages

Rendering addresses how a webpage is seen by different user agents.

Historically, we’ve always thought of how search engine crawlers see pages as being the HTML version of the page. This means that the interactive options powered by JavaScript weren’t, in this traditional thinking, seen by search engines.

This contrasts with a fully-rendered version of the page, which corresponds to how a user sees it in the browser.

Search engines can now crawl in a headless format, which allows them to see what users see in a browser. However, they don’t always do this. Rendering is still a different process from crawling. Rendering may or may not be carried out, or it may be carried out at a later time.

JavaScript for increased speed on interactive pages

One of the major advantages of JavaScript in 2019 is the increased speed it enables on interactive pages. Using JavaScript means that you don’t have to reload an entire page in order to get new information onto it.

Essentially, you use AJAX to repopulate parts of the page using JSON objects. This is a lot faster than having to download every resource on the page.
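As a rough illustration, a minimal sketch of that pattern might look like this (the /api/products endpoint and #product-list element are invented placeholders):

```javascript
// Minimal AJAX sketch: fetch a JSON payload and repopulate one part of the
// page instead of reloading everything. Endpoint and element are placeholders.
async function refreshProducts() {
  const response = await fetch('/api/products');
  const products = await response.json();
  // Only this list is re-rendered; the rest of the page stays untouched.
  document.querySelector('#product-list').innerHTML = products
    .map((product) => `<li>${product.name}</li>`)
    .join('');
}
```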

This makes for a better web, and allows applications, from Oncrawl to Gmail, to exist.

JavaScript and UX

The compatibility between JavaScript and UX depends on the capabilities of the developers and UX experts working on a given website.

There are instances of sites that use frameworks when they could have just been flat HTML. There are also examples of sites that use jQuery to update a single object, where there’s no need to download and use the whole jQuery library.

This comes down to the same issue we’ve always had to deal with in SEO: are people using the right tools in the right ways to create optimal experiences?

Even when JavaScript makes the user experience much faster, it can make the initial page load much slower. If you check the developer tools when you open a web page, almost every page shows significant time spent downloading and executing JavaScript.
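You can get a quick feel for this yourself with the Resource Timing API; here is a small sketch to run in the dev tools console:

```javascript
// Run in the browser dev tools console: sum the time spent downloading
// scripts on the current page using the Resource Timing API.
const scripts = performance
  .getEntriesByType('resource')
  .filter((entry) => entry.initiatorType === 'script');
const totalMs = scripts.reduce((sum, entry) => sum + entry.duration, 0);
console.log(`${scripts.length} scripts took ${totalMs.toFixed(0)} ms to load`);
```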

Computational expense

JavaScript used to be a major SEO issue because search engines had a hard time crawling it, but that has changed.

The biggest issue today is computational expense. Having a JavaScript renderer makes your costs way higher. A page that might take a few milliseconds to download as plain HTML might take 30 seconds to render with JavaScript in a headless browser.

Because of the current cost, Google has to determine whether or not it’s worthwhile to render pages. Some pages show little to no material difference in content once rendered. It isn’t necessarily worthwhile for Google to render pages like this; they need to make sure that when they do render a page, it’s really necessary to do so.

Mike predicts that as computational resources become faster and cheaper, Google will have less of an issue with JavaScript. Though rendering is something of an “optional” or “secondary” step for Google today, it will likely become less of a question of deferred rendering by the Web Rendering Service (WRS), and rendering might eventually be done in parallel with the crawl itself.

François reveals that when crawling, JavaScript rendering is approximately ten times more expensive in infrastructure costs than a standard crawl. Even if Google seems to have infinite resources, a discrepancy of this magnitude means that even Google will look at the cost.

For SEOs looking at how to optimize a page for search engines, therefore, looking at how you use JavaScript can be a good point to cover.

Disparity between rendered and non-rendered content versions

One of the main technical issues for SEO is the disparity between rendered and non-rendered versions of content.


Mike also often sees issues concerning the implementation of server-side rendering (SSR) architecture. Many solutions require a cache refresh, which can take a long time. If Google visits the page during the cache refresh, it may see a 5xx error, and that URL may fall out of the index.

Although Google keeps encouraging people to do dynamic serving, progressive enhancement is still the better approach: it ensures that the bulk of your content is available to any user-agent that comes to your page.

Native vs 3rd party server-side rendering

Almost all modern frameworks have solutions for SSR. Some use add-ons; others, like React, have native options such as the renderToString method.
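As a minimal sketch of what native SSR looks like in React (Express as the server and the App component are assumptions for the example):

```javascript
// Minimal native SSR sketch with React and Express (both assumed installed).
import express from 'express';
import React from 'react';
import { renderToString } from 'react-dom/server';

// Placeholder component standing in for a real application tree.
const App = ({ title }) => React.createElement('h1', null, title);

const app = express();
app.get('/', (req, res) => {
  // The component tree is rendered to an HTML string on the server, so any
  // user-agent receives fully populated markup without executing JavaScript.
  const html = renderToString(React.createElement(App, { title: 'Hello' }));
  res.send(`<!DOCTYPE html><html><body><div id="root">${html}</div></body></html>`);
});

app.listen(3000);
```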

If your framework doesn’t have support for SSR, you can use solutions like Rendertron or Prerender.io in order to effectively create HTML snapshots.

Mike recommends using SSR. When you add another server for third-party rendering, you’re adding latency. We want the site to be as fast as possible. This also creates another point of failure in the process.


Choosing a framework

– Suitability for the type of problems you’re trying to solve

First and foremost, consider what you’re trying to solve with a JavaScript framework. Each framework has different strengths and weaknesses.

– Angular vs React vs new frameworks

Consider whether you’re using a new framework just because it’s new. Angular and React, for example, are pretty entrenched at this point. There’s code and experience out there that can be leveraged.

Pick something that is both tried and true and that will be around in five years.

For Mike, React seems to be the way to go at the moment. He prefers native SSR. However, you can create snapshots with a variety of different tools, so you do have a choice depending on your technical requirements. He doesn’t believe any of the major known frameworks have any specific issue that means you might want to avoid them entirely.

Up until the latest versions, Angular was not as good as React for a number of reasons. This is curious, because Angular comes out of Google itself. However, recent versions have helped bridge this gap.

– Advantages of pushing data from central source through an API

Bing has recently stated that they now accept websites that push their content to the search engine through the API, removing the need to render that content. In some ways, this makes a lot more sense.
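As a hedged sketch, pushing a URL to Bing looks roughly like this. The endpoint and payload follow the Bing Webmaster URL Submission API as documented at the time of writing; verify the current contract in Bing’s documentation before relying on it:

```javascript
// Hedged sketch: notify Bing of new or updated content via its URL
// Submission API. Endpoint and payload shape follow Bing Webmaster Tools
// docs at the time of writing; double-check before using in production.
async function submitUrlToBing(apiKey, siteUrl, url) {
  const response = await fetch(
    `https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl?apikey=${apiKey}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ siteUrl, url }),
    }
  );
  if (!response.ok) {
    throw new Error(`Bing submission failed: ${response.status}`);
  }
}
```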

This is essentially going backwards: we used to notify search engines to come take a look at a given URL. But this approach has the advantage of encouraging an architecture for web applications that makes them more portable. It makes sense for everything (web app, mobile app…) to be built from a central source in the form of micro-services, in which you’re able to push everything out from a single API.

For Mike, if the web goes that way, it increases portability to new modes of consuming content, and makes a lot of sense.

It would also cut down on the computational expense of crawling from the search engine’s side.

Developers and SEO awareness

Mike expects developers to understand how the choice of framework and implementation affect SEO. Furthermore, he sees more and more that developers are aware of SEO concerns associated with site technology choices.

For a long time, developers took the position of “Google will figure it out”, but there’s a growing awareness that possibly stems from client-side rendering being so problematic for so long.

For example, libraries that support SSR, like Next.js, are things that come out of the developer community, rather than from an SEO initiative.

– Evangelizing SEO for developers

Mike doesn’t have a specific method to evangelize technical SEO for developers, but usually takes an approach that can be applied to anyone.

To get someone to do something for you, you appeal to their self-interest. What are developers measured on? What do they care about? Mike then tries to align SEO with these things and to demonstrate how this impacts developers’ abilities to meet their own goals. You need to show what the value is going to be to a developer’s career, to short-term OKRs… This makes implementing SEO as valuable to developers as it is to SEOs.

– Importance of understanding coding as an SEO

Mike is known for his position on the “technical SEO renaissance”.

At the heart of this change is how SEO works. You don’t have to be a developer to do SEO, but being able to understand and communicate with developers is critical. Skills that matter include:

  • Curiosity about development issues
  • Familiarity with code even if you aren’t able to write it
  • Ability to compare the view-source version of a page and the computed DOM version of a page to pinpoint where issues occur
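As an illustration of that last skill, here is a minimal sketch comparing the two versions with a headless browser (Puppeteer is an assumption here; any headless browser works):

```javascript
// Sketch: compare the raw (view-source) HTML with the computed DOM.
// Requires Node 18+ for global fetch and the puppeteer package.
import puppeteer from 'puppeteer';

async function compareVersions(url) {
  // Pre-DOM HTML, roughly what a non-rendering crawler fetches.
  const raw = await (await fetch(url)).text();

  // Post-DOM HTML, after JavaScript has executed in a headless browser.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const rendered = await page.content();
  await browser.close();

  // A large gap between the two suggests content that only exists post-render.
  return { rawLength: raw.length, renderedLength: rendered.length };
}
```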

– Qualities of good developers that are essential for great SEOs

If Mike had to choose between a marketer and a developer to turn into a great SEO, he’d probably go with the developer, for the following reasons:

  • Logical approach
  • How developers deconstruct problems
  • Communication with great attention to detail

Prioritizing content vs payload

When optimizing a website that uses JavaScript, Mike would put his first efforts into making sure the content is visible and understandable.

The payload, while critical, is what you want to focus on once you know that the content is accessible.

Mike King’s rel prerender trick

This is a trick from several years ago.

It consists of using the Google Analytics API to find the page that the user is most likely to visit next from the current page, and adding a rel=prerender hint for that page.

Mike was concerned that this would stop working with Google’s recent Chrome update to how rel=prerender works: it’s now called NoState Prefetch, and Chrome no longer executes all of the JavaScript during pre-render in the invisible tab. However, the trick still significantly speeds up the user experience on each subsequent page.
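A hypothetical sketch of the trick follows; getLikelyNextPage() stands in for a lookup against Google Analytics data (such as the most common next page path for the current URL) and is not a real GA client call:

```javascript
// Hypothetical sketch of the rel=prerender trick. getLikelyNextPage() is a
// stand-in for a lookup against Google Analytics data, not a real API call.
async function prerenderLikelyNextPage() {
  const nextUrl = await getLikelyNextPage(window.location.pathname);
  const link = document.createElement('link');
  link.rel = 'prerender'; // treated as NoState Prefetch in current Chrome
  link.href = nextUrl;
  document.head.appendChild(link);
}
```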

Googlebot User Agents

There are a lot of Googlebot User Agents that people aren’t mindful of.

If your website or your CDN blocks these user-agents, it can prevent you from taking advantage of certain search features and tools.

Keys to increasing site speed

Most of what makes a site super-fast comes from the server side:

  • Make sure you’re on a CDN
  • Use HTTP/2
  • Optimize your database
  • Consolidate queries

On the front end, though:

  • Look at code coverage
  • Manage resource loading: DNS prefetch, rel=prerender, rel=preload

Dealing with the trailing slash/non-trailing slash problem in React

Because of the way React handles URL routing, it’s easier to use a rel=canonical than to set up a 301 redirect from one version of the URL to the other.
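A minimal sketch of that approach, assuming react-helmet for head management (any head-management library works the same way; example.com is a placeholder):

```javascript
// Sketch: one canonical tag for both the trailing-slash and non-trailing-slash
// versions of a URL. Assumes react-helmet; example.com is a placeholder.
import React from 'react';
import { Helmet } from 'react-helmet';

export function Canonical({ pathname }) {
  // Normalize away the trailing slash so /page and /page/ declare the same URL.
  const path = pathname.replace(/\/+$/, '') || '/';
  const href = `https://www.example.com${path}`;
  return React.createElement(
    Helmet,
    null,
    React.createElement('link', { rel: 'canonical', href })
  );
}
```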

Pre- or Post-DOM HTML

For interpreting hreflang, Google says they use pre-DOM HTML. Mike believes that Paul Shapiro was testing this recently and might have found a different answer.

The bottom line: put hreflang in the pre-DOM HTML, because you never know whether Google will render the page or not.

You might even want to consider placing hreflang in the HTTP headers to be sure that search engines will always see it, no matter what.
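Here is a minimal sketch of hreflang served via the HTTP Link header, assuming an Express server (the URLs are placeholders):

```javascript
// Sketch: declare hreflang in the HTTP Link header so crawlers see it in the
// raw response, whether or not the page ever gets rendered. Express assumed.
import express from 'express';

const app = express();
app.get('/page', (req, res) => {
  res.set('Link', [
    '<https://example.com/page>; rel="alternate"; hreflang="en"',
    '<https://example.com/fr/page>; rel="alternate"; hreflang="fr"',
  ].join(', '));
  res.send('<!DOCTYPE html><html><body>…</body></html>');
});

app.listen(3000);
```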

Top tip

“As an SEO you need to be able to communicate with developers.”

SEO in Orbit went to space

If you missed our voyage to space on June 27th, catch it here and discover all of the tips we sent into space.

Rebecca Berbel
Rebecca is the Product Marketing Manager at Oncrawl. Fascinated by NLP and machine models of language in particular, and by systems and how they work in general, Rebecca is never at a loss for technical SEO subjects to get excited about. She believes in evangelizing tech and using data to understand website performance on search engines. She regularly writes articles for the Oncrawl blog.