Log files analysis case study

April 7, 2020 - 6 min reading time - by PJ Howland

Log file analysis takes time (less time with Oncrawl, but time nonetheless). SEOs have a long laundry list of optimizations and audits, and because log file analysis is one of the heavier items on that list, it has a nasty habit of creeping to the bottom.

It’s human nature to procrastinate, but it’s easier to get started when you know what’s waiting for you.

A recent log file analysis at 97th Floor brought their client a 25% increase in organic revenue within just 30 days of completing the analysis.

What you need to know before you run a log file analysis

If you’re new to log files, this article isn’t here to cover the basics in full; start with an introductory article on log file basics instead.

To keep things short, know that log file analysis is essentially a review of the web server logs your site has been filing away. Those logs record every request made to your server, including each instance of a bot (like Googlebot) crawling your site.
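For context, a single entry in an Apache-style access log looks something like the line below (the IP address, URL, and byte count are purely illustrative). The user agent string at the end is what identifies the request as coming from Googlebot rather than a human visitor:

    66.249.66.1 - - [07/Apr/2020:06:25:13 +0000] "GET /category/example-product HTTP/1.1" 200 5321 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"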

Many ask, “When should I run a log file analysis?” And the correct answer is simply, “Yes.”

Here’s why: Crawlability is the foundation of any technical SEO rollout. Without crawlability, sites won’t get indexed. Without getting indexed, they won’t rank. And without ranking… You get the point.

There are a few ways to enhance site crawlability, but perhaps none more effective and all-encompassing than a log file analysis. This is why it should serve as the backbone of any technical SEO rollout. So whether it’s at the beginning of a technical rollout, or whether you’re years into a mature campaign (but haven’t looked at your log files recently), it’s time for a log file analysis.

And for anyone still on the fence about running a log file analysis, I’ll add this advice: to any SEO who believes they’ve done everything in the book but is still struggling to get that stubborn keyword onto page 1, run that log file analysis.

Log file analysis in action: observing findings

Reviewing your log files can produce a myriad of findings, including the following (a rough parsing sketch after the list shows how a few of them can be pulled from raw logs):

  • Bot crawl volume
  • Crawl budget waste
  • 302 redirects
  • Response code errors
  • Crawl priority
  • Duplicate URL crawling
  • Last crawl date

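If you want a tool-free look at a few of these findings before committing to a full analysis, they can be pulled straight out of a raw log file. The sketch below is a minimal Python example, assuming an Apache/Nginx combined log format (like the sample line shown earlier) and a local file named access.log; both are assumptions, so adjust for your own setup:

    import re
    from collections import Counter

    # Assumes the "combined" access log format and a local access.log file.
    LOG_LINE = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<url>\S+) [^"]*" '
        r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
    )

    googlebot_hits = Counter()   # bot crawl volume per URL
    status_codes = Counter()     # response code errors show up here
    last_crawl = {}              # last crawl date per URL

    with open("access.log") as f:
        for line in f:
            m = LOG_LINE.match(line)
            if not m or "Googlebot" not in m.group("agent"):
                continue
            url = m.group("url")
            googlebot_hits[url] += 1
            status_codes[m.group("status")] += 1
            last_crawl[url] = m.group("time")  # logs are usually chronological

    print("Total Googlebot requests:", sum(googlebot_hits.values()))
    print("Status code breakdown:", dict(status_codes))
    print("Most-crawled URLs:", googlebot_hits.most_common(10))

A dedicated tool goes much further, but even a rough pass like this is enough to spot 404s Googlebot keeps requesting or subfolders that soak up a surprising share of crawls.
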
But these findings by themselves don’t help. It takes an SEO who can see past the issues and build solutions to get the most out of a log file analysis.

At 97th Floor, we have an ecommerce client that sells one-of-a-kind products with high price tags. Because virtually every product is unique, our client removed each product’s page from the site once the item sold. This made getting any kind of page-level growth difficult, and it caused a lot of crawlability confusion for Googlebot.

Needless to say, they were fighting an uphill battle with Google before we got involved. When onboarding the client, we immediately ran a log file analysis.

Our log file analysis produced many findings (as they usually do) but these three stood out:

  1. A large number of redundant subfolders were frequently crawled by Google
  2. Many pages that returned a 404 error were still being crawled by Googlebot. Because they had been removed from the site, they couldn’t be found in our initial site crawl
  3. Unimportant subfolders were being crawled more than key landing pages

As mentioned earlier, log file analysis doesn’t stop when the analysis is over. It continues on through the action items and into implementation and execution. In our client’s situation, Google’s crawl budget was being wasted on pages that simply didn’t move the needle.

Our new pages were well optimized, but they weren’t getting the traction needed to earn any meaningful rankings.

When we uncovered these three problems from our log file analysis, it made sense why our rankings weren’t higher. Google was using our crawl budget to look at pages that weren’t well optimized, or that weren’t optimized at all.

To earn the rankings that would bring in the traffic we needed, we first had to resolve the items from the log file analysis.

In this case, it was obvious that the crawl budget was being wasted on unimportant pages, a common finding with a log file analysis.

Solutions from log file analysis

Addressing crawl waste in redundant subfolders

Because our client was an ecommerce site, we saw a large number of subfolders that were duplicated across the site. These subfolders were mostly category pages so old that the information they held was vastly out of date and nearly impossible to discover organically.

But Googlebot had not only discovered them. It kept coming back frequently to recrawl them, eating into our crawl budget.

Our solution was to delete and redirect these redundant subfolders to more appropriate and relevant subfolders. We had just launched a completely revised category structure that was supposed to help us rank for some larger keywords. Can you guess where we redirected these redundant subpages?

Redirecting these old and forgotten pages to their newer, more optimized, counterparts gave us a leg up in the SERPs.
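As a purely illustrative example (assuming an Apache server and made-up paths), a single rule like the one below can sweep an entire legacy subfolder into its new counterpart with 301 redirects; the real implementation depends on your server and CMS:

    RedirectMatch 301 ^/old-category/(.*)$ /new-category/$1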

Correcting crawl waste in unimportant subfolders

This point seems similar to the previous one about redundant subfolders. Redundant subfolders were forgotten, outdated duplicates of our current pages. The distinction is that unimportant subfolders were still relevant, just not cornerstone pages for search.

The solution here was anything but straightforward. Since we didn’t know the root cause of the situation, we prescribed a broad solution that involved a little bit of everything, including:

  • Inserting strategic internal links from our low priority (but highly crawled) pages to our high priority, SEO-optimized pages
  • Rearranging the sitemap to include more important pages higher in the .xml file
  • Revising the rel=”canonical” and meta robots information on our higher priority pages
  • Revisiting the robots.txt file to ensure nothing was being blocked that shouldn’t be (large ecommerce sites especially need to check this; see the sketch after this list)
  • Deleting and removing unnecessary pages
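
On the robots.txt point, even a rough spot-check helps. The sketch below uses Python’s built-in robotparser to confirm that a handful of high-priority URLs are still crawlable by Googlebot (the domain and URLs are placeholders, not the client’s):

    from urllib import robotparser

    # Placeholder domain and URLs; swap in your own high-priority pages.
    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    priority_urls = [
        "https://www.example.com/categories/new-structure/",
        "https://www.example.com/products/flagship-item/",
    ]

    for url in priority_urls:
        if not rp.can_fetch("Googlebot", url):
            print("Blocked for Googlebot:", url)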

Clearing out dead pages

We had already conducted a site audit where we crawled the site and found all 404 errors that an internal site crawl could identify.

But that’s the beauty of a log file analysis: you’re not pulling limited data from your own internal crawl. You’re viewing exactly what Google is seeing. Essentially, it’s a site audit from an outsider’s perspective, and from the most important outsider of all: Google.
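To make that comparison concrete, here is a rough sketch of the cross-check, assuming you have already exported the 404 URLs Googlebot requested (from the logs) and the full URL list from your internal crawl into plain text files (the file names are hypothetical):

    # One URL per line in each exported file.
    with open("googlebot_404s.txt") as f:
        log_404s = {line.strip() for line in f if line.strip()}

    with open("site_crawl_urls.txt") as f:
        internally_found = {line.strip() for line in f if line.strip()}

    # Pages Google still requests (and gets a 404 for) that no internal link
    # points to anymore: prime candidates for 301 redirects.
    orphaned = log_404s - internally_found
    for url in sorted(orphaned):
        print(url)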

The fix here was simple, which helped make up for all the time spent on the previous point. We 301 redirected these old orphaned pages to their optimized counterparts on the site. Done.

The results of a log file analysis

The best part of our log file analysis for this client was the fast results it earned them.

As mentioned before, it earned the client a 25% increase in organic revenue in just 30 days following the implementation of the action items above.

During this time we also saw a small increase in conversion rates for organic traffic. Because this uptick happened during a slow season, there’s no seasonality at play here. And the fact that conversion rates increased means that the traffic didn’t just come in greater numbers, it was also qualified traffic.

The traffic was likely more qualified because, after effectively redistributing our client’s crawl budget with Google, we drew attention to more intent-based pages, earning better rankings for higher-intent keywords. This meant we were drawing users lower in the funnel to the right pages to assist them in their journey.

It may sound sappy, but at 97th Floor our credo is that we make the internet a better place. Log file analyses do this because they compensate for the shortcomings of Google’s inaccurate crawls. This brings higher quality pages to the top of SERPs, making the internet search experience that much better for everyone.

Next time you’re faced with a crawlability problem, I hope you won’t think twice about executing a log file analysis, presenting solutions, and seeing them through. You’ll see your site earn more traffic and conversions.

PJ Howland is the VP of Industry Insights at 97th Floor. His role is to champion great content and advocate 97th Floor’s mission: to make the internet a better place. When he’s not shelling out content, he can be found at home living his lifelong dream of hobby homesteading.