Using SEA and vertical bots metrics to improve SEO

October 30, 2018 - 5 min reading time - by Nicolas Duverneuil

The OnCrawl metrics for SEA and vertical bots give SEOs an overview of the behavior of additional web crawlers belonging to Google.

Google uses about a dozen different bots to explore the web. However, Google doesn’t provide a lot of information about its bots other than googlebot (desktop and mobile), which explores the web in order to index and update pages. This is why we don’t know as much about the behavior of “secondary” bots, and why we lack information about their role in the world of Google searches.

The information gleaned from log files and provided in OnCrawl is intended to shine light on the interaction between these bots and SEO.

What are SEA and vertical bots?

The official list of bots includes:

  • AdWords bots, which check the quality of pages for which ads have been purchased in Google Ads
  • AdSense bots, which are supposed to check the contents of a page in order to serve appropriate ads as part of the page's monetization
  • Media bots, which find images and videos on a website and index them for use in the SERPs for image or video searches

We’ve also observed additional bots, such as the AMP bot and a bot dedicated to Ajax, which was finally grouped as an instance of googlebot-desktop. Some of these are not yet included in the official lists. For this reason, we’ve decided not to include them in the OnCrawl reports at this time.

When might I have other Google bots on my site?

We know that Google uses googlebot and googlebot-mobile to index pages used in responses to search queries. These bots appear on all of the websites that don’t disallow them, with the goal of indexing all of the web. But when do we see the other bots?

If you invest in ads with Google Ads, no matter the size of your site or the amount you invest, you will see hits by the AdWords bots.

If you use Google’s monetization in order to allow Google AdSense to place ads on your site, you’ll note visits by Google AdSense bots, which can be identified by their “mediapartners” user-agent.

The bots that index images and videos make vertical search possible. They can explore any site looking for image and video files.
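Since all of these crawlers announce themselves through their user-agent strings, they can be told apart directly in your server logs. As a minimal sketch (the family names and the parsing are illustrative, not OnCrawl's implementation), the documented user-agent tokens can be matched in order from most to least specific:

```python
import re

# Documented Google user-agent tokens, checked from most to least specific:
# "Googlebot-Image" must be tested before the generic "Googlebot".
BOT_PATTERNS = {
    "adsense": re.compile(r"Mediapartners-Google", re.I),
    "adwords": re.compile(r"AdsBot-Google", re.I),
    "images": re.compile(r"Googlebot-Image", re.I),
    "video": re.compile(r"Googlebot-Video", re.I),
    "googlebot": re.compile(r"Googlebot", re.I),
}

def classify(user_agent: str) -> str:
    """Return the first matching bot family for a user-agent string, or 'other'."""
    for name, pattern in BOT_PATTERNS.items():
        if pattern.search(user_agent):
            return name
    return "other"

print(classify("Mediapartners-Google"))                 # adsense
print(classify("Googlebot-Image/1.0"))                  # images
print(classify("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # googlebot
```

Note that dictionary order matters here: checking `Googlebot-Image` before the generic `Googlebot` keeps vertical bot hits from being miscounted as standard SEO crawls.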

How to use it: monitor your images’ SEO

Google has been improving the visibility of visits to your site that come from Google Images: since the summer of 2018, Google Analytics lists image searches in Google as a separate traffic source.

With the increase in image searches, it is more necessary than ever to place product images in the Google Images SERPs. Image results are a growing source of organic traffic and bring a significant flow of consumers to the majority of e-commerce sites: as early as 2014, 63% of image searches led to website visits. Multiple indicators show that the content of this source is drawn from the images explored by Googlebot-Image.

By following the Googlebot-Image crawler, you can make sure that your images continue to be indexed and updated correctly. As when monitoring googlebot behavior to better understand standard SEO, spikes and other abrupt changes in the behavior of Googlebot-Image are often indications of problems with your rankings in Google Images.
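Spotting those spikes can be done directly from your access logs. A minimal sketch, assuming combined-format log lines and an illustrative "twice the average" threshold (both assumptions, not OnCrawl's actual method):

```python
import re
from collections import Counter

# Extracts the date part of a combined-log timestamp, e.g. "[30/Oct/2018:10:00:00 ..."
LOG_DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def daily_image_bot_hits(log_lines):
    """Count Googlebot-Image hits per day from combined-format access log lines."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot-Image" in line:
            m = LOG_DATE_RE.search(line)
            if m:
                hits[m.group(1)] += 1
    return hits

def flag_spikes(hits, factor=2.0):
    """Return days whose hit count exceeds `factor` times the average (illustrative rule)."""
    if not hits:
        return []
    avg = sum(hits.values()) / len(hits)
    return [day for day, n in hits.items() if n > factor * avg]
```

Days flagged this way are a prompt to investigate, not proof of a problem: a spike can also follow a legitimate event such as a large batch of new product images.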

For successful image SEO, base your strategy on the best practices for images:

  • Monitor the behavior of Googlebot-Image
  • Use images at an appropriate display size
  • Provide Google with a usable image file size
  • Include your chosen keywords in your image file names
  • Add appropriate alt text wherever images are used on your site

How to use it: quickly identify SEO-SEA cannibalization

When SEO campaigns for organic search and SEA campaigns for paid traffic are not perfectly coordinated, clicks on certain keywords can be cannibalized.

This happens when investment in paid search for a keyword increases: for the same search, a paid result and an organic result appear for the same site. In these cases, the URL ranked organically maintains its position and earns the same number of impressions, but it no longer attracts as many visitors as before.

Studies have shown that this drop in CTR is due to the presence of an ad for the same site that appears on the SERP above the first organic result.

By monitoring AdWords bots, you can spot changes in SEA campaigns that risk cannibalizing your website’s SEO: spikes in hits by the google-adwords and google-adwords-mobile crawlers can be the result of an investment in paid traffic through Google Ads. This way, without using additional tools, OnCrawl helps equip you with indicators for SEA campaigns.
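A simple alert on this signal can be sketched as follows, assuming you have already tallied AdWords bot hits per week (the function name and the 1.5x threshold are illustrative choices, not a documented rule):

```python
def adwords_activity_alert(weekly_hits, threshold=1.5):
    """Flag a probable SEA campaign change.

    weekly_hits: weekly AdWords-bot hit counts, oldest first.
    Returns True when the latest week exceeds the baseline average by `threshold`x.
    """
    if len(weekly_hits) < 2:
        return False  # not enough history to establish a baseline
    *baseline, latest = weekly_hits
    avg = sum(baseline) / len(baseline)
    return latest > threshold * avg

print(adwords_activity_alert([120, 130, 110, 400]))  # True: probable new SEA investment
print(adwords_activity_alert([120, 130, 110, 115]))  # False: stable crawler activity
```

When the alert fires, it is worth checking whether the keywords concerned also rank organically, since that is exactly the situation where cannibalization can occur.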


How to use it: decrypting the magic “crawl budget”

The old Google Search Console interface provides a “crawl budget” for your site. However, the information displayed in this section doesn’t match the graphs obtained from server log data for the bots responsible for SEO indexing (googlebot, googlebot-mobile).

By studying the behavior of all bots belonging to Google, we’ve been able to see that the data in Google Search Console isn’t limited to the behavior of googlebot, but is instead an aggregate of the behavior of all of the bots belonging to Google.

The aggregation of all of these bots has the side effect of hiding certain behaviors and of creating false alerts for other behaviors. The breakdown by bot offered by OnCrawl aims at decrypting the information hidden behind the Google Search Console graphs. This helps answer the following questions:

  • Does a spike in the data (pages or bytes) indicate a global trend, or is a particular bot responsible for it?
  • Is a drop in crawl budget a sign of a problem with the indexing of your pages, or is it due to a non-SEO bot?
  • Can an observed problem (in bytes) be attributed to a single bot?
  • How can a lack or loss of visits by a bot be corrected?
    • AdSense: is the bot blocked by a script or by robots.txt?
    • AdWords: in the case of automated investments, has a problem occurred?
    • Images / videos: does the robots.txt restrict bot access to video or image files?
    • Googlebot: should you adjust your SEO strategy?
  • How is the crawl budget distributed among the different bots?
  • If the general graph is not indicative of the behavior of the SEO bots, what are the differences?
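Answering the "single bot" questions above comes down to breaking the aggregate figure apart. A minimal sketch, assuming you have already extracted `(bot, bytes)` pairs from your logs (the record format and function name are illustrative):

```python
from collections import defaultdict

def bytes_by_bot(records):
    """Attribute crawled bytes to each bot family.

    records: iterable of (bot_name, bytes_downloaded) pairs extracted from logs.
    Returns a plain dict of per-bot byte totals, so a spike in the aggregate
    Google Search Console graph can be traced back to a single crawler.
    """
    totals = defaultdict(int)
    for bot, size in records:
        totals[bot] += size
    return dict(totals)

records = [("googlebot", 5000), ("googlebot-images", 120000), ("googlebot", 7000)]
print(bytes_by_bot(records))  # {'googlebot': 12000, 'googlebot-images': 120000}
```

In this toy example, a bytes spike in the aggregate graph would be attributable almost entirely to the image bot, not to standard SEO crawling.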

A word of warning, however: if you are running careful comparisons between the information in Google Search Console and the information in your server logs, note that the statistics in Google Search Console are recorded on California (Pacific) time. For a website on the US East Coast, 3 hours of each day (or 9 hours for sites in Western Europe!) are counted as part of the previous day.
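To line a Search Console day up with logs kept in UTC, you can convert its boundaries explicitly. A sketch using Python's standard `zoneinfo` module (3.9+; the function name is illustrative):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def gsc_day_bounds_utc(date_str):
    """Return the UTC start and end of a Search Console day given as 'YYYY-MM-DD'.

    Search Console days are bounded by Pacific time, so the UTC offset
    (-7h or -8h) depends on whether daylight saving time is in effect.
    """
    pacific = ZoneInfo("America/Los_Angeles")
    start = datetime.fromisoformat(date_str).replace(tzinfo=pacific)
    end = start.replace(hour=23, minute=59, second=59)
    return start.astimezone(ZoneInfo("UTC")), end.astimezone(ZoneInfo("UTC"))

start, end = gsc_day_bounds_utc("2018-10-30")
print(start)  # 2018-10-30 07:00:00+00:00 (PDT is UTC-7 on this date)
```

Filtering your log lines to this UTC window before aggregating makes the comparison with the Search Console graphs meaningful.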

Nicolas is an SEO expert currently working as a Technical SEO Analyst at Cdiscount, France's largest e-commerce website. A former Customer Success Manager at OnCrawl, he is a specialist in big e-commerce websites and loves the geek side of SEO. You can reach him on Twitter.