5 actionable uses for OnCrawl’s API

March 3, 2020 - 3 min reading time - by Nicolas Duverneuil

Whether for data recovery, managing a WordPress site, handling connected objects, monitoring or any other use, APIs are increasingly part of our daily life, whether at home or at work.

Obviously, OnCrawl’s API is no exception.

OnCrawl allows you to access all of its services and features through its API. It lets you launch or schedule crawls, retrieve raw or analyzed data, and more.

Of course, this can all be done directly in the application, but using the API can allow you to take things a step further.

Creating custom dashboards

A common use of an API for data retrieval is the integration of data into dashboards.

OnCrawl already provides many dashboards. The Custom Dashboards feature also allows you to create your own. However, you may want to see this data or these reports combined with KPIs from external sources.

For example, you might want to cross-reference your crawl data (structure, word counts, elements retrieved via scraping, duplication rate, etc.) or log data (pages crawled or not by Googlebots) with business data in order to create focused analyses whose changes you want to track over time.

The API is the most efficient and relevant way to retrieve this data, which can then be sent to and processed by your other tools.
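As a minimal sketch of this kind of retrieval, the example below builds a query for page data and posts it to the API. The endpoint path, field names (`url`, `inrank`, `word_count`, `depth`) and body shape are assumptions for illustration; check the OnCrawl API reference for the exact ones available to your account.

```python
# Hedged sketch: pulling crawl data from the OnCrawl API to feed a dashboard.
# Endpoint path, body shape and field names are illustrative assumptions.
import json
import urllib.request

API_BASE = "https://app.oncrawl.com/api/v2"  # assumed base URL


def build_pages_query(fields, depth_max=None):
    """Build a query body selecting `fields`, optionally filtered by depth."""
    body = {"fields": fields}
    if depth_max is not None:
        # Assumed filter syntax: keep pages whose depth is <= depth_max.
        body["oql"] = {"field": ["depth", "lte", depth_max]}
    return body


def fetch_pages(crawl_id, token, fields):
    """POST the query to the (assumed) pages endpoint and return parsed JSON."""
    req = urllib.request.Request(
        f"{API_BASE}/data/crawl/{crawl_id}/pages",
        data=json.dumps(build_pages_query(fields)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Live use (requires a valid API token and crawl id):
# rows = fetch_pages("YOUR_CRAWL_ID", "YOUR_TOKEN",
#                    ["url", "inrank", "word_count"])
```

The returned rows can then be pushed into whatever dashboarding tool already holds your business KPIs.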

Automation

Crawl needs differ from site to site, and from team to team. Using the API can allow you to adapt OnCrawl to each user or to each team.

For example, verification crawls can be launched automatically a few hours after a website goes live to ensure that there has been no regression, that pages respond with the correct status codes, etc.
This would involve adding calls to the OnCrawl API to the deployment script for the new version of the site, which would then launch the appropriate crawls.
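A deployment hook of this kind could look roughly like the following. The endpoint (`/crawls`) and the idea of reusing a saved crawl configuration by id are assumptions for illustration, not the documented API contract:

```python
# Hedged sketch: triggering a verification crawl at the end of a deployment
# script. The endpoint and payload shape are illustrative assumptions.
import json
import urllib.request

API_BASE = "https://app.oncrawl.com/api/v2"  # assumed base URL


def build_launch_payload(config_id):
    """Body asking OnCrawl to start a crawl from a saved configuration (assumed)."""
    return {"crawl_config_id": config_id}


def launch_crawl(token, config_id):
    req = urllib.request.Request(
        f"{API_BASE}/crawls",
        data=json.dumps(build_launch_payload(config_id)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Called from the deployment pipeline once the new version is live:
# launch_crawl("YOUR_TOKEN", "YOUR_VERIFICATION_CRAWL_CONFIG_ID")
```

Scheduling the call a few hours after release gives caches and sitemaps time to settle before the verification crawl runs.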

Building smart internal linking strategies

As an SEO, one of your goals might be to make sure that nearly all of your pages have an equal chance to be crawled and ranked.

One possible approach would therefore be to ensure that all pages benefit from a relatively homogeneous Inrank (an indicator of internal popularity of the pages).

Using recent crawl data (the current Inrank of the target pages, pages flagged as popular, etc.), you can dynamically (re)calculate the site’s internal linking structure in order to reduce depth, even out Inrank, and so on.
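Once the Inrank values have been retrieved via the API, spotting the pages that lag behind is a simple local computation. Here is a small sketch; the `url` and `inrank` field names are assumptions about the shape of the retrieved data:

```python
# Hedged sketch: flag pages whose Inrank lags behind the site average, as
# candidates for additional internal links. Field names are assumptions.

def underlinked_pages(pages, threshold_ratio=0.8):
    """Return URLs whose Inrank is below threshold_ratio * site average."""
    avg = sum(p["inrank"] for p in pages) / len(pages)
    return [p["url"] for p in pages if p["inrank"] < threshold_ratio * avg]


pages = [
    {"url": "/", "inrank": 10},
    {"url": "/category", "inrank": 8},
    {"url": "/deep-article", "inrank": 3},
]
print(underlinked_pages(pages))  # → ['/deep-article']
```

The flagged URLs can then be fed back into whatever system generates your "related links" blocks.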

Obviously, the field of possibilities extends far beyond internal linking. The logic to remember is that the API can allow you to include crawl data, or even analyses, in the strategy you use to create/manage your site.

Customized alerts

As an SEO, you most probably follow a number of KPIs, both common KPIs and ones that are specific to your site.

The email you receive after each crawl lets you compare how KPIs change from one crawl to the next. Whether it’s the ratio of active pages in a specific category, recent articles that don’t generate traffic, or pages that are poorly positioned but frequently crawled, you can find this cross-analyzed data in OnCrawl.

If you want to be alerted to changes in this data with every crawl without having to log in to the application, the API lets you retrieve the information directly from OnCrawl and feed it into your alerting tools. You can even send it to services such as IFTTT or Zapier.
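The alerting side can stay very simple: compare a KPI between two crawls and, if it drops beyond a tolerance, post a message to a webhook (Zapier and IFTTT both accept incoming webhooks). The KPI values are assumed to have been fetched from the API already; only the alert logic is sketched here:

```python
# Hedged sketch: alert when a KPI falls more than `tolerance` (relative)
# between two crawls, and push the message to a webhook URL of your choice.
import json
import urllib.request


def kpi_drop_message(name, previous, current, tolerance=0.1):
    """Return an alert string if `current` fell more than `tolerance`
    below `previous`; otherwise return None."""
    if previous and (previous - current) / previous > tolerance:
        return f"{name} dropped from {previous} to {current}"
    return None


def send_webhook(url, message):
    req = urllib.request.Request(
        url,
        data=json.dumps({"text": message}).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


msg = kpi_drop_message("active pages ratio", 0.62, 0.48)
if msg:
    # send_webhook("https://hooks.zapier.com/...", msg)  # your webhook URL
    print(msg)  # → active pages ratio dropped from 0.62 to 0.48
```

Running this after each crawl (from cron or your CI) gives you push alerts without ever opening the application.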


Automated SEO audits

Whether you are a webmaster or a consultant, you might want to automate all or part of your SEO audits (or at least the technical aspects).

Once again, the API can make your life a lot easier!

Most of the onsite elements of an SEO audit are easily found in OnCrawl. So why not put aside your Excel checklist and replace it with an “in-house” tool that will check the different elements of your site one by one using the API?

By taking the time to fully break down each element in the analysis and by preparing the appropriate queries in OnCrawl, you may be able to effortlessly generate your audit at the click of a button.
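One way to structure such a tool is as a list of named checks run over the page data retrieved from the API. The field names (`status_code`, `title`, `depth`) are assumptions about the shape of that data:

```python
# Hedged sketch: an audit expressed as named checks over page data fetched
# from the API. Field names are illustrative assumptions.

CHECKS = {
    "no 4xx/5xx": lambda p: p["status_code"] < 400,
    "has a title": lambda p: bool(p.get("title")),
    "depth under 5": lambda p: p["depth"] < 5,
}


def run_audit(pages):
    """Return, for each check, the list of URLs that fail it."""
    return {name: [p["url"] for p in pages if not check(p)]
            for name, check in CHECKS.items()}


pages = [
    {"url": "/ok", "status_code": 200, "title": "Home", "depth": 1},
    {"url": "/broken", "status_code": 404, "title": "", "depth": 6},
]
print(run_audit(pages))
```

Adding a check then means adding one line to `CHECKS`, rather than another column to an Excel sheet.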

Using the API here will save you valuable time that you can spend interpreting the data or doing other things (like visiting the coffee machine more often).

There is no shortage of examples of how the OnCrawl API can be used.

To summarize, keep in mind that the OnCrawl API gives you access to all the data available via the Data Explorer, allowing you to go even further than the application itself. The only limit is your imagination.

Nicolas is an SEO expert currently working as a Technical SEO Analyst at Cdiscount, France’s largest ecommerce website. A former Customer Success Manager at OnCrawl, he is a specialist in big e-commerce websites and loves the geek side of SEO. You can reach him on Twitter.