At Oncrawl, you can ask us anything about your reports. Sometimes the Oncrawlers (our users) ask tough questions that help us improve our knowledge base. Depth distribution is a frequent source of questions.
What is the best depth distribution for my website?
There is definitely no single good answer here… It depends a lot on the size of your website and your type of content. If you have a small corporate website with just a few hundred pages, you could probably get a perfectly optimized website with 5 levels of depth. But if you have a large website (over 100K URLs) or a very large one (1M+), having more than 15 levels might not be a very good idea.
To keep it factual, I computed data from over 1,000 websites to see how many depth levels they have. I came up with this chart showing the average maximum depth by website size:
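As a reminder, a page's depth is simply the minimum number of clicks needed to reach it from the homepage. Here is a minimal sketch of how a crawler computes it with a breadth-first search, using a small hypothetical internal link graph (the URLs are made up for illustration):

```python
from collections import deque

def page_depths(homepage, links):
    """Breadth-first search over internal links: a page's depth is the
    minimum number of clicks needed to reach it from the homepage."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        url = queue.popleft()
        for target in links.get(url, []):
            if target not in depths:  # keep only the shortest path
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

# Toy internal link graph (hypothetical URLs)
links = {
    "/": ["/shoes", "/accessories"],
    "/shoes": ["/shoes/sneakers"],
    "/accessories": ["/accessories/belts"],
    "/shoes/sneakers": ["/"],
}
depths = page_depths("/", links)
max_depth = max(depths.values())  # the "max depth" metric charted above
```

Note that depth is a shortest-path notion: even if a page is also linked from deep pages, only its closest route to the homepage counts.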
But I think the main question here is not finding the “greatest” depth distribution of URLs. What matters is content distribution. Are your top priority pages well located within your website? Is there a rational reason why a particular category of pages is deeper than another one? Here is a good example of a website whose SEOs have thought a lot about their content distribution:
A checklist to get started analyzing your SEO and your page depth levels
When you are analyzing your page depth, you should go with this quick checklist:
- Remember that Google’s early guidelines recommended no more than 100 links on a webpage. Since you know how content is presented on your website, you should be able to theoretically estimate what your depth distribution looks like. If something looks off, you should investigate.
- Look at the distribution of your groups of pages alongside your depth distribution. Does it match your business constraints? If you are selling shoes and accessories, your main business is to sell shoes, so all the pages presenting shoe products or brands should be the closest to your homepage, with your accessories pages located deeper.
- Pay attention to loading time by depth. If you set up a cache system with a standard configuration, chances are your deepest pages are not in the cache. So loading time degrades as you dig deeper and deeper into your website. That might hurt your SEO.
- Check how InRank flows through your depth levels. In my opinion, and based on all the data we gather from the websites we crawl, a drop of more than two points from one depth level to the next typically means you have not optimized your internal linking structure. Again, be precise when analyzing InRank by checking it for each group of pages. At the end of the day, you should always keep an eye on the InRank of your top priority pages: a page at depth 5 can still receive a lot of link juice, so depth may not be an issue for certain pages.
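The two-point InRank rule from the last checklist item is easy to check mechanically once you have exported the average InRank per depth level. A minimal sketch (the InRank values below are made up for illustration):

```python
def inrank_drops(inrank_by_depth, threshold=2):
    """Flag depth transitions where average InRank falls by more than
    `threshold` points -- a hint that internal linking needs work."""
    drops = []
    depths = sorted(inrank_by_depth)
    for prev, curr in zip(depths, depths[1:]):
        delta = inrank_by_depth[prev] - inrank_by_depth[curr]
        if delta > threshold:
            drops.append((prev, curr, delta))
    return drops

# Hypothetical average InRank per depth level (0 = homepage)
inrank = {0: 10.0, 1: 8.5, 2: 7.9, 3: 4.2, 4: 3.8}
flagged = inrank_drops(inrank)
# The 2 -> 3 transition loses 3.7 points, above the two-point threshold
```

The same loop can be run per group of pages rather than site-wide, which is usually more revealing.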
To follow up on this type of question, you should also run a combined analysis with your log data to see how depth impacts the Crawl Ratio of your pages. To give you a few more insights, I checked across more than 300 websites how Googlebot behaves depending on page depth. Here are the results:
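The combined analysis boils down to joining two data sources: page depths from the crawl and Googlebot hits from the server logs. A minimal sketch of computing crawl ratio per depth level, with made-up URLs and log entries:

```python
from collections import Counter

def crawl_ratio_by_depth(page_depths, crawled_urls):
    """Crawl ratio per depth: share of known pages at each depth that
    Googlebot actually hit, according to the server logs."""
    totals = Counter(page_depths.values())
    crawled = Counter(page_depths[u] for u in set(crawled_urls) if u in page_depths)
    return {d: crawled.get(d, 0) / totals[d] for d in sorted(totals)}

# Hypothetical crawl export (URL -> depth) and Googlebot hits from the logs
depths = {"/": 0, "/shoes": 1, "/accessories": 1, "/shoes/old-model": 4}
googlebot_hits = ["/", "/shoes", "/shoes"]
ratios = crawl_ratio_by_depth(depths, googlebot_hits)
# Homepage fully crawled, half of the depth-1 pages, the deep page ignored
```

In practice you would deduplicate log hits over a time window and segment by group of pages, but the joining logic stays the same.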
If you have good stories about how you handled page depth optimization, I would love to hear from you in the comments.