It is also the only scripting language supported by all web browsers that handle client-side scripting.
However, in October 2015 Google deprecated its old AJAX crawling scheme, which relied on escaped-fragment #! URLs and HTML snapshots. Googlebot is now generally able to render and understand web pages on its own.
Key pain points about crawling JS
- If you use the AngularJS framework, links and click zones must carry an ng-href attribute (not only an ng-click handler), so that a real href is rendered for crawlers.
- A href attribute is understood by crawlers only when it sits on an anchor (<a>) element.
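For illustration, here is what a crawlable AngularJS link looks like next to a click handler that crawlers cannot follow (the route and names are hypothetical):

```html
<!-- Crawlable: ng-href is rendered into a real href on an <a> element -->
<a ng-href="/products/{{product.id}}">{{product.name}}</a>

<!-- Not crawlable: the click handler navigates via JS, but there is
     no href for a crawler to discover and follow -->
<div ng-click="goToProduct(product.id)">{{product.name}}</div>
```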
Besides that, Google still has some limitations regarding JS crawling:
- Google needs unique, clean URLs, with links located in the HTML source.
- The rendered snapshot of each page is taken at about 5 seconds, so content must have loaded by then, or it simply won’t be indexed.
- All of a page’s resources (JS, CSS) need to be available to be crawled, rendered and indexed.
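A common way these resources get blocked is a restrictive robots.txt. A minimal sketch, with hypothetical paths, of rules that leave the JS and CSS Googlebot needs open to crawling:

```
# robots.txt — hypothetical paths; do not Disallow the JS/CSS
# that Googlebot needs in order to render the page
User-agent: *
Allow: /assets/js/
Allow: /assets/css/
```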
Google also says that:
“If you’re starting from scratch, a good approach is to build your site’s structure and navigation using only HTML. Then, once you have the site’s pages, links, and content in place, you can spice up the appearance and interface with AJAX.”
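Google’s advice here amounts to progressive enhancement. A minimal sketch, assuming a hypothetical /product/42/reviews URL: the link works, and is crawlable, as plain HTML, and JavaScript only upgrades the experience:

```html
<!-- Plain HTML first: this link works (and is crawlable) without any JS -->
<a id="reviews-link" href="/product/42/reviews">Read reviews</a>
<div id="reviews"></div>

<script>
  // Enhancement layer: intercept the click and load the reviews in place.
  // Crawlers and no-JS visitors still follow the plain href above.
  document.getElementById('reviews-link').addEventListener('click', function (e) {
    e.preventDefault();
    fetch(this.href)
      .then(function (res) { return res.text(); })
      .then(function (fragment) {
        document.getElementById('reviews').innerHTML = fragment;
      });
  });
</script>
```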
Links and content are among the most important ranking factors to analyze. Since search engines can now find and index your JS-rendered content, it’s critical to know what they are seeing and how it impacts your SEO.
- Find text content: get an accurate content-quality analysis that includes content such as product reviews and user-generated comments, which are often rendered in JS.
- Pre-render Angular/Backbone.js pages to transform ng-href attributes on JS links into regular links, so the crawl can keep exploring even without a@href markup, just as Google does. This ensures that your whole website can be crawled and fed into our modeling process.
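As a rough sketch of the idea behind that transformation: a real pre-renderer executes the page’s JavaScript, but this toy version, using a hypothetical template string, simply rewrites static ng-href attributes into plain href so a crawler that only follows a@href can discover the links:

```javascript
// Toy sketch of the pre-render idea: expose Angular-style links as plain
// a@href markup. A real pre-renderer would execute the page's JavaScript;
// this only rewrites static ng-href attributes in a template string.
function exposeAngularLinks(html) {
  return html.replace(/\bng-href=/g, 'href=');
}

// Hypothetical AngularJS template snippet
const template = '<a ng-href="/products/42">See product 42</a>';
console.log(exposeAngularLinks(template));
// → <a href="/products/42">See product 42</a>
```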