
Senior SEO: What skills do you really need?

November 15, 2022 - 18 min reading time - by Riccardo Mares

The SEO specialist is a somewhat mythological figure orbiting the digital world, and descriptions of the role vary wildly. If we asked people what an SEO does, we would receive a wide array of answers: some think SEO experts have magical powers, others believe they have a direct line to the god Google, and some even wonder whether paying Mr. G lets SEOs do whatever they like – which, of course, is not the case!

A few weeks ago, I therefore put together a list on LinkedIn of the essential skills that a Senior SEO is expected to possess. Partly out of haste and partly on purpose, the list turned out to be incomplete, and that generated a lot of buzz, resulting in dozens of interesting comments.

In the list, there are over forty attributes that an SEO superhero should have – a superhero who, with a simple gesture… poof, manages to rank first for every keyword!

But before we go deeper into these skills, it is worth remembering that search engine optimization is a truly broad field. Finding SEOs specialized in one or more areas of expertise is certainly easier than meeting people who tick every box on the list; nonetheless, some basic knowledge has to be considered a sine qua non.

SEO fundamental knowledge

Any SEO specialist must know and understand the ecosystem in which they operate – how Google works, what Googlebot does, what the status of a page is from an SEO point of view, how SERPs are formed and how they evolve – including all the add-on elements that pop up daily in the middle of the results.

In addition, they must be able to perform keyword analysis, assess rankings, evaluate changes recorded by tools – primarily Google Search Console/Google Analytics – and assess the structural and content correctness of a site.

Crawlers, algorithms and directives

First of all, it's important to understand how crawlers work (Googlebot primarily) and how to communicate with them. This communication happens on three levels: direct communication, indirect communication and offsite signals.

Direct communication refers to the tools we use to guide search bots across our site. These are essentially signals that control where and how Google crawls a site: robots.txt, meta robots tags, X-Robots-Tag headers, <link> tags and status codes. It's important to be familiar with these because they let you communicate your preferences about the bot's behavior. Some also use them for privacy reasons, but please remember that real privacy comes from not publishing content online!
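To make this concrete, here is a minimal Python sketch that checks two of these direct signals for a single page: the robots.txt rules and the X-Robots-Tag header. It assumes the standard library's robotparser plus the requests package, and the URL is just a placeholder.

```python
# A minimal sketch (not a full audit): check whether a placeholder URL is
# blocked by robots.txt for Googlebot, and whether the server sends an
# X-Robots-Tag header.
from urllib import robotparser
import requests

SITE = "https://www.example.com"
URL = f"{SITE}/some-page/"

# 1. Direct signal: robots.txt (controls crawling, not indexing)
rp = robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()
print("Googlebot may crawl:", rp.can_fetch("Googlebot", URL))

# 2. Direct signal: X-Robots-Tag response header (controls indexing/snippets)
resp = requests.head(URL, allow_redirects=True, timeout=10)
print("Status code:", resp.status_code)
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))
```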

Indirect communication describes the content and the semantic relationships between pages.

Finally, offsite signals are actions that take place off your site but still have an impact on your rankings. Good offsite SEO allows a website to reaffirm its trustworthiness and authority, and backlinks are the major player among offsite signals.

Data analysis and SEO tools

Once you have a grasp of the fundamentals, it’s necessary to understand what happens “inside” and through the search engines. So, we need to improve our skills in data analysis and learn how to efficiently use SEO tools.

The main tools the SEO must master are:

  • SERPs
  • Search engine tools
  • External tools

The SERPs are the battleground: we can check many queries, analyze what kind of media the search engines display and – of course – check whether a single resource, or an entire section of a site, is indexed.

Our best friends are two search operators, which we can combine with the minus sign (-) to include or exclude results (a small sketch follows the list below):

  • site:
  • inurl:
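As a quick illustration, this hedged Python sketch builds search URLs that combine the two operators; the domain and the /blog/ path are placeholders.

```python
# A minimal sketch: build Google search URLs that combine the site: and
# inurl: operators, including the minus sign to exclude results.
from urllib.parse import urlencode

def serp_url(query: str) -> str:
    return "https://www.google.com/search?" + urlencode({"q": query})

# All indexed pages of the domain
print(serp_url("site:example.com"))

# Indexed pages of the blog section only
print(serp_url("site:example.com inurl:/blog/"))

# Indexed pages of the domain, excluding the blog section
print(serp_url("site:example.com -inurl:/blog/"))
```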

The SERPs also show a result count, but be wary when assessing this figure. It is best treated as an estimate, since it can change across repeated runs of the same query, or even when moving from one results page to the next or the previous one.

For deeper analysis, it's best to move to SEO tools, and here too we have two kinds to consider: the search engines' own tools and external tools.

The leading proprietary tools are provided by the search engines themselves:

  • Google Search Console
  • Bing Webmaster Tools
  • Yandex Webmaster Tools (Russia)
  • Baidu Webmaster Tools (China)

Remember that these tools are your only direct way to check what is happening to your website inside each specific search engine, so you must learn how to interpret their data and the trends in their graphs.

For example: did you know there's a direct correlation between your site's response time to Googlebot and the number of pages Google reads? Or that there seems to be a correlation between visibility in the news carousel / Discover and CrUX values?

I'm sure you already use GSC and you're wondering why you should bother with BWT, given how rarely webmasters pay attention to Microsoft's search engine. Good point, I'll address it shortly.

In Google Search Console, you can find much information about:

  • How many pages are indexed;
  • How many pages are excluded (and why);
  • Tools to test individual pages for crawlability and indexability.

You can also see (a small API sketch follows these lists):

  • Performance metrics about Core Web Vitals, mobile compatibility and response time
  • Crawling stats
  • Ranking results
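If you want to pull this data programmatically rather than through the interface, here is a hedged sketch using the Search Console API via google-api-python-client. The service-account key file, property URL and dates are placeholders, and the property must be verified and shared with the service account beforehand.

```python
# A minimal sketch of pulling performance data from the Search Console API.
# Assumes google-api-python-client, google-auth and a service account JSON key.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder key file
)
gsc = build("searchconsole", "v1", credentials=creds)

response = gsc.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2022-10-01",
        "endDate": "2022-10-31",
        "dimensions": ["query"],
        "rowLimit": 10,
    },
).execute()

# Top queries with clicks, impressions and average position
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"], round(row["position"], 1))
```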

If you can find all that in GSC, why also use BWT?

I often use BWT because Google's index report is too minimal, even if Google has made many improvements over the past few years. In BWT you can find a great index status report, laid out much like the standard Windows File Explorer: WOW! And yes… you can check the indexation status of resources by browsing paths and subdomains!

Sure (I hear you), this isn't Google's index, but isn't it reasonable to assume that if Bing has seen certain resources, Google has most likely seen them too?

All that being said, why do we need to use external tools to check the site’s health? I think there are two main reasons:

  • External tools can be used to analyze the web as a whole, your site's statuses, where your site ranks in the SERPs and your backlinks.
  • With the search engine tools, you can only check the data of your own site; external tools also help you see what your competitors are up to.


I use external tools in the following ways:

  • To compare real organic traffic data with the traffic projections provided by rank tracker tools. I find it a good way to understand whether the SERPs include distractions that impact Click-Through Rate (CTR) and therefore organic traffic; these tools usually ignore such SERP features and only account for traditional organic results (a small comparison sketch follows this list).
  • To compare the sites I'm working on with their competitors. Please, when you do this kind of analysis, never never never compare results between different tools: every tool has its own algorithm to convert rank and query volume into traffic projections.
  • To analyze competitors' link-building activities.
  • To analyze the backlink profiles of the sites I'm working on. Sure, you can find the real data in the GSC report, but it's so limited that we need to supplement it with external data (anchors, rel attributes, source page details, etc.).
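Here is a hedged sketch of the first comparison, assuming pandas and two hypothetical CSV exports; the file names and column names ("page", "organic_sessions", "projected_traffic") are made up, not a standard format.

```python
# A minimal sketch: compare real organic sessions (e.g. an analytics export)
# with the traffic projected by a rank tracker, page by page.
import pandas as pd

real = pd.read_csv("ga_organic_landing_pages.csv")        # columns: page, organic_sessions
projected = pd.read_csv("rank_tracker_projection.csv")    # columns: page, projected_traffic

df = real.merge(projected, on="page", how="outer").fillna(0)
df["gap"] = df["organic_sessions"] - df["projected_traffic"]

# Pages where projections overshoot reality may sit in SERPs full of
# "distractions" (ads, widgets, rich results) that depress CTR.
print(df.sort_values("gap").head(10))
```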

I also can't forget to mention the debug tools that let me emulate search engine crawlers and check – in bulk – the page elements as well as how the links connect. When I talk about links, I mean not only <a> tags, but <link> tags too, and every resource referenced inside the page (a minimal bulk-check sketch follows the list below).

During a crawl, a healthy site:

  • Replies "only" with status code 200, and every page is eligible for indexation
  • Tends to have ZERO redirection status codes (3xx)
  • Has ZERO client error status codes (4xx)
  • Has ZERO server errors (status code 5xx)
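As a taste of what those debug tools do, here is a single-page Python sketch (requests + BeautifulSoup assumed, placeholder URL) that collects every <a> and <link> URL on one page and checks its status code; a real SEO crawler would also follow internal links, respect robots.txt and throttle its requests.

```python
# A minimal, single-page sketch: list every <a> and <link> URL on a page
# and flag anything that doesn't answer with a 200.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"  # placeholder
html = requests.get(START, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

urls = set()
for tag in soup.find_all(["a", "link"]):
    href = tag.get("href")
    if href and not href.startswith(("#", "mailto:", "javascript:")):
        urls.add(urljoin(START, href))

for url in sorted(urls):
    # HEAD is enough to read the status code; some servers may require GET
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    flag = "" if status == 200 else "  <-- check (3xx/4xx/5xx)"
    print(status, url, flag)
```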

This kind of optimization, achieved by using robots.txt and rel="nofollow" wisely, lets you optimize the crawl budget by guiding the bot where we want and need it. And yes, I suggest this kind of activity even if you don't have a large or very large website!


Page container

During my SEO courses (back when I taught them), I would usually say: "The best content on a poor website can become poor content." Something similar happens if you put a great chef in a restaurant that's unappealing from the outside: the risk is that no one will ever get the chance to taste the delicious dishes.

In the previous chapter, I talked about the general state of the site, but it’s also important to keep in mind the three dimensions of organic website success (crawlability, indexability, rankability). When considering those three things, the first place to start is with your webpages.

The page is the container of the content, the frame that enhances the picture. As SEOs, we need to make sure that the main SEO tags exist (title, h1, meta description), and we have to be concerned about the page's visual stability during loading and interaction – something Google has been measuring for a while now.

An SEO has to check if the page loads fast enough as well as if the main content loads first and holds the prime visual position above the fold.

In the last year, I have seen more and more websites built on JavaScript technologies, and this evolution in programming is something that SEOs – supported by programmers – need to understand. Search engines don't only see the HTML (page source): they analyze the DOM, which is the result of the HTML code once the CSS and JS "modifiers" have been applied.

Please remember: Googlebot renders pages with Chrome and – for some years now – its version has directly followed the Google Chrome browser releases. Yes, it can see any kind of DOM rendering, excluding what is triggered by user interaction (clicks, scrolls, …).
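To see the difference between the raw HTML and the rendered DOM for yourself, here is a hedged sketch using Playwright's headless Chromium (assumes `pip install playwright requests` and `playwright install chromium`; the URL is a placeholder, and counting <a> tags is only a rough signal of how much content JS adds).

```python
# A minimal sketch: compare the raw HTML (what a naive fetch sees) with the
# rendered DOM (closer to what an evergreen, Chromium-based bot sees).
import re
import requests
from playwright.sync_api import sync_playwright

URL = "https://www.example.com/"  # placeholder

raw_html = requests.get(URL, timeout=10).text

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_dom = page.content()
    browser.close()

def count_links(html: str) -> int:
    return len(re.findall(r"<a\s", html, flags=re.I))

print("Links in raw HTML:    ", count_links(raw_html))
print("Links in rendered DOM:", count_links(rendered_dom))
```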

Do you think the list is getting too long? Well, I haven’t even started to talk about structured data!

In SEO community groups, the term "semantic web" comes up regularly. Some use it to talk about content relationships, but the correct use of the concept relates to structured data: snippets of code (most commonly JSON-LD inside a <script> tag) that let you attach categorized, machine-readable information to your traditional content.

For example, if you have a favorite recipes page, structured data lets you specify to the search engine the name, the calories and the preparation steps. Then, for some data types, Google will turn your work into a rich snippet. Have you seen them already?
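Here is a small Python sketch that generates that kind of Recipe markup as JSON-LD; the recipe values are made up, and the full list of supported properties is documented on schema.org and in Google's structured data guidelines.

```python
# A minimal sketch: build Recipe structured data as JSON-LD, ready to be
# embedded in a <script type="application/ld+json"> tag.
import json

recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Grandma's Tiramisu",
    "nutrition": {"@type": "NutritionInformation", "calories": "450 calories"},
    "recipeInstructions": [
        {"@type": "HowToStep", "text": "Whip the mascarpone with the eggs and sugar."},
        {"@type": "HowToStep", "text": "Dip the ladyfingers in coffee and layer everything."},
    ],
}

print('<script type="application/ld+json">')
print(json.dumps(recipe, indent=2))
print("</script>")
```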

Page content

The hardest branch of knowledge to improve is related to semantic SEO. In this case, I’m not referring to semantics related to structured data, but more specifically the page content. The difficulty comes from the need to create content.

It requires that we improve:

  • Keyword analysis / search intent analysis / clustering (see the sketch after this list)
  • Morphological analysis of SERPs and their changes
  • Content and meta-content analysis (though the copywriting itself is handled by the content team)
  • Identification and resolution of SEO conflicts / cannibalization
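To give an idea of the clustering step, here is a hedged sketch that groups keywords by lexical similarity with TF-IDF and KMeans (scikit-learn assumed, toy keyword list). Real search-intent clustering usually also compares SERP overlap, not just the words themselves.

```python
# A minimal sketch of keyword clustering: TF-IDF vectors + KMeans groups.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

keywords = [
    "running shoes for women", "best women running shoes", "trail running shoes",
    "running shoes sale", "how to clean running shoes", "clean white sneakers",
]

X = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(keywords)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for cluster in sorted(set(labels)):
    print(f"Cluster {cluster}:", [k for k, l in zip(keywords, labels) if l == cluster])
```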

Working on content is a demanding profession in its own right, and I'm sure you'd agree with me that writing content shouldn't be the SEO's job, any more than performing technical audits should be the content team's! So why and how should SEOs be involved in the semantic context?

Our mission is to understand how people search for what your site has to offer, and how search engines interpret queries and consequently decide what kind of web pages to display in the SERPs.

When starting a new website, I suggest that the SEO be involved from the very beginning: to analyze the organic market, identify SERP competitors, detect the kinds of SERP widgets involved, and so on.

Then, together with the information architect, they will design the website structure, and the SEO needs to establish guidelines for content creators and e-commerce managers as well, for every page and every level. The mission is to avoid cannibalization and to set content priorities by design.

What if the site already exists? Start praying for good collaboration with the existing team (programmers, content writers, UX/UI designers, e-commerce managers, …). My suggestion is to start with a "helicopter" view: identify the main flaws, then plan a continuous issue -> (re)solution loop until you're down to fixing minor details.

Advanced SEO knowledge

We are often asked about the difference between a Junior SEO and a Senior SEO. From our point of view, the evolution of the SEO professional towards seniority is characterized by the expansion of their skills, and much more: Senior SEOs have developed a strategic vision integrated with other digital activities and are able to manage projects independently.

Going back to SEO skills, here are some "superpowers" that allow SEO specialists to stand out and take the site-vs-Google challenge to a much higher level.

Strategic capabilities

Strategy is not something that can be measured like rankings or traffic. Strategy is a high-level view of a project, based on knowledge of the system, knowledge of the project, and a solid understanding of the mission and the expected goals.

The SEO strategy goes beyond the SEO audit: the audit is a snapshot of the project at a given moment; the SEO strategy is a vision of the current situation plus a plan of the actions we should take to achieve the objectives we have set.

To create a good strategy, you have to combine all of your technical knowledge with years of experience. SEO isn't an exact science, and planning over the medium to long term requires mixing many kinds of skills, including some that can't be captured in a definition.

Competitor analysis

Directly linked to strategy, competitor analysis is the skill that lets you interpret what happens in the SERPs and understand why rankings change or CTR wavers.

What kind of data do we base this analysis on? Unfortunately, the website owner's official tools (GSC, BWT, …) can't help us as much as we'd like. Data that can help the experienced SEO includes:

  • Rankings (you still need a keyword analysis!)
  • Traffic projections
  • Backlink profile comparison (quantity and quality)
  • Content comparison

For the first three points, the only solution is to use external tools (SEMrush, Sistrix, Ahrefs, Majestic SEO, …). But for content comparison, I absolutely suggest doing it manually. In the past few years, a lot of "semantic" tools have been created, but I truly believe that the right interpretation of content has to stay in the hands of actual humans. And yes… I hear you: Google uses algorithms to compare content, but we have no way to use those algorithms ourselves.

Log analysis

One of the "dark" sides of SEO is log analysis. Some months ago, I wrote an introduction to log analysis for Oncrawl, and every time I try to explain the value of this activity, I'm met with the wide-open eyes of SEOs, system administrators, webmasters and marketing managers.

The principle of log analysis is to understand, without interpretation, what a bot sees on our website when it reads pages, how much or how often it reads pages and what pages were never read at all.

Sure, GSC or BWT give some interesting overviews of bot activity, but usually they only share sampled or aggregated data. With the log files, an SEO can meticulously examine bot behavior and find answers to the issues he or she spots when studying graphs or analyzing the site.
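To show how simple the first step can be, here is a hedged Python sketch that counts Googlebot hits per URL and per status code from an access log. The file name and the Apache combined log format are assumptions, adapt the parsing to your own server; and in a real analysis remember to verify the bot's IP ranges, since the user agent alone can be spoofed.

```python
# A minimal sketch: count Googlebot hits per URL and per status code from an
# access log in the Apache common/combined format (assumed).
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*?"(?P<ua>[^"]*)"$')

paths, statuses = Counter(), Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:  # placeholder file
    for line in log:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("ua"):
            paths[m.group("path")] += 1
            statuses[m.group("status")] += 1

print("Status codes seen by Googlebot:", dict(statuses))
print("Most crawled URLs:")
for path, hits in paths.most_common(10):
    print(f"{hits:>6}  {path}")
```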

A tool like Oncrawl also lets you go one step further: it combines log data from bots, log data from users and qualified data such as Google Analytics, Google Search Console or whatever other data you can import into the platform. It's a really helpful tool for a comprehensive analysis of your site.


Speed analysis

For many years, Google has repeated – like a litany – that site speed is important. For CRO/UX specialists and marketing managers, speed is crucial because it is directly related to conversion rate, and therefore directly related to MONEY. Search online and you'll find hundreds of case studies quantifying the improvement in conversion rate for every millisecond saved.

On the SEO side, performance is directly related to ranking, crawling and, consequently, indexing.

Let me elaborate a little more on what I mean by these relationships (a small measurement sketch follows the list):

  • The better the site's response time, the more pages Googlebot will read each day (either more pages or more frequently).
  • The faster Googlebot reads pages, the faster it can index them (or refresh its indexes).
  • The better a page's response time, the better its ranking position can be. We don't know exactly how much page speed is worth in terms of ranking, but it's fairly safe to say: the fastest wins!
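Here is a hedged Python sketch for a first look at response time, measuring roughly the time to first byte for a few placeholder URLs with requests; a proper audit would test repeatedly, from several locations, and look at full rendering metrics as well.

```python
# A minimal sketch: measure server response time (roughly, time to first byte)
# for a few URLs. response.elapsed covers the time from sending the request
# until the response headers are parsed.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/category/",
    "https://www.example.com/product/",
]

for url in urls:
    response = requests.get(url, stream=True, timeout=10)  # stream=True: stop at headers
    ttfb_ms = response.elapsed.total_seconds() * 1000
    print(f"{response.status_code}  {ttfb_ms:7.1f} ms  {url}")
    response.close()
```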

Up to this point, I have been talking about response time, which I usually simplify and call server performance. But there are other important metrics to consider: some time ago, Google introduced additional metrics to evaluate the site experience – the Core Web Vitals.

CWV are a group of metrics that attempt to measure the quality of the user experience. Google collects them anonymously from real users' Chrome browsers (unless users block this). The main metrics are:

  • LCP (Largest Contentful Paint): The amount of time to render the largest content element visible in the viewport, from when the user requests the URL.
  • FID (First Input Delay): The time from when a user first interacts with your page (when they clicked a link, tapped on a button, and so on) to the time when the browser responds to that interaction.
  • INP (Interaction to Next Paint – coming soon): Observes the latency of all interactions a user has made with the page and reports a single value below which all (or nearly all) of those interactions fell.

There are also other metrics that SEOs can use to evaluate client and server performance – TTFB, FCP, TTI, TBT – but for now, they don't directly affect rankings.

An SEO isn't a front-end developer or a system administrator, so knowing how to program isn't a necessity. What SEOs do need to know, however, is how to read performance reports – both client-side and server-side – and how to talk with programmers and sysadmins to improve site performance, traffic, conversions and customer satisfaction.

Other important skills to level up

I think the SEO career has two vectors of growth, either vertical or horizontal. Vertical means you aim to focus on and improve one specific skill like auditing, log analysis or content analysis.

Horizontal growth means developing T-shaped knowledge: defining one area of expertise, but also learning a bit about a number of other areas of SEO. If you're more focused on the horizontal growth vector, you may have already noticed that the role of the SEO often changes based on the team they work with.

If an SEO is working with:

  • Content producers, they share traffic analysis and keyword analysis
  • Developers, it's usually to optimize the site and its speed performance
  • Sysadmins, it's to optimize speed performance
  • An analytics or marketing team, they have to understand "economic" performance data and the role of SEO among the other traffic sources

The more SEOs can integrate themselves into all aspects surrounding a website, the more they can improve their work and the site's organic performance too.

But what about SEO certifications?

Can these skills be certified? My answer is NO, and I covered this in depth in a 2019 article, "Best SEO Certification" (in Italian).

What does exist are certifications provided by SEO tool developers or web agencies. In the best-case scenario, these represent a confirmation – given by SEO experts – that a training path has (at least) been followed and that exams have been passed. At worst, they are trivial attendance certificates disguised as certifications.

From an academic point of view, there are no specific degree programs, and personally I am not in a position to recommend one course of study over another. An IT mindset helps a lot with technical activities, while a background in economics or the humanities is a boost for strategic and creative activities.

And yes, SEO creative activities do exist!

A good dose of imagination – combined with knowledge of a specific market sector – makes it possible to intercept consumers' search intentions (even the latent ones) and propose SEO content strategies that respond to those intentions and bring in quality traffic.

Because – in the end – the "trivial" task of the SEO expert has always been this one and only: improving a website's quality traffic over time. By quality, I mean traffic that adds economic margin to the business in the short, medium and long term.

The rest is a good mix of metric aspirations and fluff 🙂.

How to become an SEO specialist and what will be the future of SEO?

In this article, I tried to describe the main skills and characteristics of the ideal SEO specialist – the “superhero”. However, as I said, there are still no ad-hoc certifications or academic courses for those who want to take this path.

So what can I suggest to those who would like to enter the world of SEO and improve their profile? In other words, what is the future of SEO specialists?

Here are the words of Andrea Melloni, Head of SEO at Studio Cappello – one of the longest-established SEO agencies in Italy:

"Curiosity, accuracy and strategic mindset: I believe that these are the three fundamental skills that – beyond the knowledge acquired through tests and training – turn a Junior SEO into a Senior. Curiosity implies eagerness to learn and go further. Accuracy means taking care of every single detail and thus outlining a path to follow. The strategic mindset, on the other hand, is essential for managing projects and communicating with customers effectively. Of course, mistakes can happen, but they are nothing more than steps we take on the way to the coveted results."

Thanks for your attention.

Riccardo Mares
Riccardo started working at the end of 1999, after a degree in computer engineering. Over the years he learned to program in ColdFusion, JS, PHP, ASP, SQL, CSS, ... developing web portals, e-commerce sites and CMSs. In 2010 he was hired at Studio Cappello as an SEO Specialist, and today he is the Head of SEO and the COO of the agency. He is one of the main speakers at Italian industry events and regularly chats with SEOs from all over the world on Twitter.