SEO means working with a large amount of data. A single website can have thousands of subpages, keywords, and internal links. If it operates in a competitive industry, it needs external links from thousands of domains. And what if you're optimizing many websites? You end up with some real Big Data.
Add to that the variability of Google's algorithm, unforeseen actions of competitors, and disappearing backlinks. How do you keep all of this under control?
Photo by Luke Chesser from Unsplash
The market doesn't like it when nothing's happening, and the SEO industry is no different. Every feature and indicator relevant from an SEO perspective can be monitored with specialized tools. Unfortunately, there's no single tool that monitors everything. But with just a few apps, you can keep track of everything that's important. You can also combine data from several sources – more on that later.
Monitoring services deliver what matters most:
- reports that let you analyze how individual values change over time
- alerts that immediately inform you of any significant changes to the selected features and indicators.
What can (and should) be monitored in SEO
The fact that you own a website (or your customer owns it) and have access to the CMS and all of its configuration doesn't mean you know everything there is to know about it. It's very easy to miss important problems that affect more than just SEO.
- Organic traffic – beyond the overall traffic volume, in particular the bounce rate and the conversion rate. You surely pay attention to this, but do you monitor it regularly?
- Availability (uptime) – if the website doesn't work, you know what happens: it won't convert anybody, and it may even turn users away from the brand. Prolonged outages may result in deindexing.
- Loading speed – after a website's been optimized for speed, you can't just close the subject and forget about it. After all, all it takes is someone publishing a huge bitmap on the home page to undo all that work. You must detect such situations and react.
- Correct functioning – a glitch in an important function of a website is a silent killer of conversion: the website's operational, but you can't place an order because the button at the last step of the purchasing process doesn't work. Sure, you'll see the effects as a drop in sales, but it's much better to detect and fix the problem before you start losing money.
- Domain and SSL certificate expiration – as we all know, domain name registrars and SSL certificate issuers are very glad to remind you that you need to renew your subscription. However, things can happen, especially if you’re using multiple email inboxes and aliases. You can miss such a situation – and an additional reminder from the external monitoring won’t hurt anyone.
- Being present on blacklists – a red warning screen displayed by the browser instead of the website itself is a situation which everybody’d prefer to avoid. This most often means that a website’s been infected by malware and has become a threat to the users. In such a situation, you must react immediately so that as few users as possible encounter such a message.
- Robot blockades – if you've never experienced this, cast the first stone. A staging version of the website gets pushed to production along with a robots.txt file blocking search engine robots. Or an X-Robots-Tag in the HTTP header, invisible at first glance. It's better to detect such a mistake before Google updates its index as per our own "request".
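An accidental robot blockade is easy to check for programmatically. The minimal sketch below (plain Python, no third-party libraries; the parsing is deliberately simplified and the example inputs are hypothetical) flags a robots.txt that disallows everything for all agents, and a noindex directive in the X-Robots-Tag header:

```python
def robots_blocks_all(robots_txt: str) -> bool:
    """Return True if robots.txt disallows the whole site for all user agents."""
    agent_is_wildcard = False
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "user-agent":
            agent_is_wildcard = (value == "*")
        elif key == "disallow" and agent_is_wildcard and value == "/":
            return True
    return False

def header_blocks_indexing(headers: dict) -> bool:
    """Detect a noindex/none directive in an X-Robots-Tag HTTP header."""
    tag = headers.get("X-Robots-Tag", "").lower()
    return "noindex" in tag or "none" in tag

# A staging robots.txt accidentally shipped to production:
assert robots_blocks_all("User-agent: *\nDisallow: /") is True
# A harmless, empty Disallow rule:
assert robots_blocks_all("User-agent: *\nDisallow:") is False
# The header-level blockade, invisible in the page source:
assert header_blocks_indexing({"X-Robots-Tag": "noindex, nofollow"}) is True
```

In practice you would fetch `https://your-site/robots.txt` and the page's response headers (e.g. with `urllib.request`) and run these checks on a schedule – exactly the kind of alerting the monitoring services described here automate for you.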
You surely observe the external results of your actions, but observing isn't the same as monitoring. It's good practice to check reports regularly and receive alerts at key moments – instead of staring at the results all the time.
- SERP position – the basis for determining the effects of SEO activities, and often for the amount of remuneration for SEO services. Very difficult to track on your own – not so much because Google blocks many repeated queries, but because of the extensive personalization of search results and the constant appearance of new snippets changing how the SERPs look.
- Performance in Google – the number of SERP impressions, CTR and clicks, crawl rate and crawl errors. First-hand data, meaning data straight from Google – from the Google Search Console.
- Other indicators – such as Trust Flow (TF) or Citation Flow (CF) are used mainly to determine the value of websites as potential link locations. But it's also worth monitoring how the optimized website itself is doing in this respect.
- Backlinks – the SEO "currency", the direct result of content marketing and many other activities. Some are worth their weight in gold, others much less. Backlinks need monitoring on two levels:
- general – aggregate counts and quality
- detailed – tracking the specific acquired links, checking that they haven't been deleted or edited (e.g. by adding the "nofollow" attribute).
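The detailed level – checking that a specific acquired link still exists and hasn't been tagged "nofollow" – can be sketched with Python's standard-library HTML parser. This is a minimal illustration, not a production crawler; the page snippet and `example.com` domain are made up for the example:

```python
from html.parser import HTMLParser

class BacklinkChecker(HTMLParser):
    """Collects links pointing at `target` and whether each carries rel="nofollow"."""
    def __init__(self, target: str):
        super().__init__()
        self.target = target
        self.found = []  # list of (href, is_nofollow) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.target in href:
            rel = (attrs.get("rel") or "").lower()
            self.found.append((href, "nofollow" in rel))

# A backlink that was silently edited to nofollow after publication:
page = '<p>See <a href="https://example.com/post" rel="nofollow">this guide</a></p>'
checker = BacklinkChecker("example.com")
checker.feed(page)
# checker.found now shows the link is present but devalued by nofollow
```

Run against the live linking page (fetched on a schedule), an empty `found` list would mean the link was deleted, and a `True` nofollow flag would mean it was edited – both cases worth an alert.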
Google Search Console
The most important tool for tracking the effects of SEO activities is the above-mentioned Google Search Console. Unfortunately, it doesn't offer configurable alerts – notifying you, for example, about a decline in SERP positions. GSC does send various types of alerts, but it does so entirely at its own discretion.
Google Analytics is also very important – or alternatively, any other package used for tracking traffic and user behavior on the website. Here you can define your own alerts and get notified when something unusual happens to the traffic on your website.
Majestic is a source of very valuable data about an optimized website – including the links leading to it – thanks to its own link index, built much like Google's.
Some of the advanced SEO platforms – like OnCrawl for example – offer continuous monitoring of the overall health of a website. The service tracks various metrics collected by crawling and log analysis.
Services such as Super Monitoring handle on-site monitoring – in particular availability, speed and the correct functioning of websites. They test a website up to several thousand times a day, detecting, recording and reporting irregularities.
Position monitoring in the search results can be handled, to a certain extent, via the Google Search Console. However, the possibilities are limited here – and we’re missing the above-mentioned alerts. Majestic and OnCrawl also come to the rescue here with their rank trackers.
As a result, you need at least four tools to fully monitor your SEO. If you configure alerts in all of them, you'll protect yourself against missing any important change in the situation of the positioned website.
But what about the reports? Logging into several different applications day after day and comparing the data in separate windows is torture.
Source: SEO Reporting Dashboard – by Windsor.ai
Fortunately, most monitoring applications share their data via API – and you can combine several data sources in "dashboard" type solutions. Google Data Studio is the most well-known one. Interestingly, you can also connect data from sources that don't offer an API – e.g. directly from a database. Thanks to this, your key-indicator dashboard may also include information from the website itself.
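Whatever dashboard tool you use, the underlying idea is the same: metrics from several sources get joined on a shared key, usually the date. A minimal sketch of that join in plain Python (the source names and values below are invented sample data, standing in for whatever your APIs return):

```python
from collections import defaultdict

def merge_sources(**sources):
    """Merge several {date: value} series into {date: {source_name: value}},
    so that gaps in any one source stay visible instead of being dropped."""
    merged = defaultdict(dict)
    for name, series in sources.items():
        for date, value in series.items():
            merged[date][name] = value
    return dict(merged)

# Hypothetical per-day exports from two monitoring APIs:
gsc_clicks = {"2024-05-01": 120, "2024-05-02": 98}      # e.g. Search Console
uptime_pct = {"2024-05-01": 100.0, "2024-05-02": 99.7}  # e.g. uptime monitor
dashboard = merge_sources(clicks=gsc_clicks, uptime=uptime_pct)
```

A dashboard tool does essentially this at scale, adding charts on top – which is why a shared key (the date) across all your sources is the one thing worth standardizing early.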