We want to make Big Data technologies accessible and actionable for all SEOs. We believe in what we do and are proud to have created the next generation of semantic SEO crawlers.
As an in-house SEO, I’ve always been surprised by how little attention we paid to on-page SEO compared to building quality link profiles. Yet with onsite SEO, we can monitor far more of our data and understand what’s helping and what’s hurting our rankings. Let me explain why we built it.
We are only human, so tracking our on-page changes is a real pain. On large websites it becomes practically impossible, especially if they run on user-generated content (hello, marketplaces!). We don’t have an exhaustive overview of our content: product pages, listings, how do they perform inside the website? Are they popular? Do they get the right amount of link juice? It’s not humanly possible to know. You can only guess.
A few years ago, you could still rely on Google Webmaster Tools, but we all know it doesn’t give you a clear overview, nor access to an up-to-date dataset. I was always thinking beyond “OK, I have duplicate title tags”: how can I be sure that my top product pages are properly crawled, indexed, and ranked by Google? GWT didn’t tell me that. So I really needed something more actionable.
When I first met my partner @Tuxnco, who was working at Exalead (a French competitor to Google), he told me about all the data they used to understand a website.
We decided that this data should be available to all web workers. Since we were really good at crawling the web, we tried to figure out how we could help the SEO community. That’s how we came up with OnCrawl.