Psst… There is a secret I want to tell you.
Your site has a “crawl budget” set by Google.
This is the secret metric Google uses to balance two things:

- The crawl rate limit: how fast Googlebot can crawl your site without overloading your server.
- The crawl demand: how much Google wants to crawl your pages.

This article will focus on the first point.
Improving the technical health of your site will increase your crawl budget.
The bigger your crawl budget, the more frequently Google will stop by and read your pages.
Let’s start by sharing an understanding of what a crawl budget is.
Google uses a special software program called a web crawler (or spider) to read pages on your site.
They call this web crawler Googlebot.
Crawl budget is the term that describes how often Googlebot will crawl your pages.
By optimizing your site you can increase your crawl budget.
Google has said that your crawl budget is a combination of:

- The crawl rate limit
- The crawl demand
As these metrics improve, you will see Googlebot visiting more often and reading more pages on each visit.
Once Google crawls a page, it adds the content to the Google Index, which then updates the information shown in Google Search results.
By optimizing for crawl budget you can improve the speed of updates from your site to Google Search.
Google has a tough task: it needs to crawl and index every page on the internet.
The computing resources required are huge, and Google cannot crawl and index everything.
Optimizing your crawl budget will give your site the best chance of appearing in search.
Improving a site is about making Googlebot’s time on a site as efficient as possible.
We don’t want:

- Googlebot hitting broken links or pages that return errors.
- Googlebot following long redirect chains.
- Googlebot crawling duplicate or low-value pages.

All of the above wastes Google’s precious resources and could see your crawl rate drop.
A lot of the work of technical SEO is the same as optimizing the crawl budget.
Let’s take a look next at creating the perfect page for Googlebot.
OK so maybe not the perfect page but we should try and improve the page as much as we can.
Let’s look at some common on-page issues that you can improve.
One way to improve the crawl budget is to make the page fast.
Fast pages let Googlebot crawl more URLs in less time, and a fast response is a sign to Google that the web server is “healthy”.
Google has said that page speed increases the crawl rate.
Making a site faster improves the users’ experience while also increasing the crawl rate.
For a list of page speed improvements have a look at this in-depth website performance review. It has 30 steps to improve website performance.
As a smart SEO, you know that before starting any optimization you need to track your changes.
You need to pick a data point with two properties:

- It can be measured accurately.
- It directly reflects the thing you are trying to improve.
So what is the data point we should track for crawl budget?
We said earlier that Google uses two factors when deciding on a crawl budget:

- The crawl rate limit
- The crawl demand
Since we are technical SEOs, our job is to improve the crawl rate.
So this is the data point that we should track.
So how do we track the Googlebot crawl rate?
We need to use your web server access logs.
The logs store every request made to your web server: every time a user or Googlebot visits your site, an entry gets added to the access log file.
Here is what an entry would look like for Googlebot:
127.0.0.1 - - [11/Nov/2019:08:29:01 +0100] "GET /example HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
There are three important data points in each entry. The date:

[11/Nov/2019:08:29:01 +0100]

The request:

“GET /example HTTP/1.1”

And the user-agent, which tells us that it’s Googlebot making the request:
"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
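If you want to pull these three data points out of an entry programmatically, a short script will do it. Here is a minimal sketch in Python, assuming the combined log format shown above; the function and field names are my own:

```python
import re

# Regex for the combined log format shown above (the Nginx/Apache default).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<user_agent>[^"]*)"'
)

def parse_entry(line):
    """Extract the date, request, and user-agent from one access log line."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

entry = parse_entry(
    '127.0.0.1 - - [11/Nov/2019:08:29:01 +0100] "GET /example HTTP/1.1" '
    '200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
    '+http://www.google.com/bot.html)"'
)
print(entry["date"])     # 11/Nov/2019:08:29:01 +0100
print(entry["request"])  # GET /example HTTP/1.1
print("Googlebot" in entry["user_agent"])  # True
```

Checking the user-agent string is enough for a rough count, since Googlebot always identifies itself this way.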
The above log entry is from an Nginx web server, but all web servers, such as Apache or IIS, produce a similar access log entry.
Depending on your setup, you may also have a Content Delivery Network (CDN). A CDN such as Cloudflare or Fastly will also create access logs.
Analyzing an access log manually is possible, but it is not much fun.
You could download the access.log file and analyze it in Excel. However, I recommend using a log analyzer such as the one from OnCrawl.
This lets you see the Googlebot crawl rate on a graph and in real time. Once you have this monitoring set up to track the crawl rate, you can start to improve it.
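If you do want a quick check of the crawl rate before reaching for a dedicated tool, a few lines of Python can count Googlebot requests per day straight from the access log. This is a rough sketch, assuming the log format shown earlier; the sample entries and IP addresses are illustrative:

```python
import re
from collections import Counter

def googlebot_hits_per_day(log_lines):
    """Count access log requests made by Googlebot, grouped by day."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # skip requests from users and other bots
        # The day is the part of "[11/Nov/2019:08:29:01 +0100]"
        # before the first colon.
        match = re.search(r'\[([^:\]]+)', line)
        if match:
            counts[match.group(1)] += 1
    return counts

# Illustrative entries; in practice pass an open file instead,
# e.g. googlebot_hits_per_day(open("access.log")).
sample = [
    '66.249.66.1 - - [11/Nov/2019:08:29:01 +0100] "GET /example HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [12/Nov/2019:09:12:44 +0100] "GET /other HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [12/Nov/2019:10:00:00 +0100] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
for day, hits in sorted(googlebot_hits_per_day(sample).items()):
    print(day, hits)
```

Plotting these daily counts over time gives you the baseline to measure every change against.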
Now that we know what we are tracking, we can look at making some improvements. But don’t make many changes at the same time; be methodical and make changes one by one.
Build, Measure, Learn.
Using this technique, you can adapt the changes you are making as you learn, concentrating on the tasks that improve the crawl rate.
If you rush and change too much at once, it can be difficult to understand the results, making it hard to tell what has and has not worked.
Over time as the page improves you will see an increase in the crawl budget as the crawl rate goes up.
We have covered exactly what a crawl budget is.
As a Technical SEO, you have the power to increase the crawl rate of the site.
By improving the site’s technical health, you can make Googlebot’s time on your site efficient.
Track the crawl rate using your logs for accurate results.
Use Build, Measure, Learn as a technique to make one change at a time and improve as you go.
Over time your crawl rate will increase. Your pages will appear quicker in Google Search Results. And users will have a great experience on your site.