
Webinar digest: Top 10 SEO mistakes to avoid in 2017

January 25, 2017 - 8 min reading time - by Emma Labrador

On January 19th, we hosted a webinar about the top 10 SEO mistakes to avoid in 2017. In this recap, we go over the mistakes to banish, from site migration and mobile to JavaScript and pagination, and we present new ways of working.

1# Content that is too thin

Thin content is a common mistake you have heard about many times. Oncrawl can help you spot this type of content more easily through our “Content” tab. We flag pages with fewer than 300 words, but you also need to consider:

  • Your website’s size;
  • Your website’s type.

It is thus necessary to avoid auto-generated content and content that is too short. Google favors unique, interesting and dense content. As Google increasingly rewards user experience, usage metrics are taken into account more and more:

  • Time on site;
  • Bounce rate.

Thin content identified via Oncrawl

Following that logic, Google ranks websites according to the content density needed to get listed and indexed.
The following graph shows a combined analysis of Google’s bot behavior on pages, based on crawl and log data.

Crawl frequency by content density

In that example, the number of words is a key element that influences crawl frequency. The less content you have, the less Google will index and rank a website. By the same logic, the less content you have, the more likely you are to have near-duplicate content.
The following query lets you identify pages with fewer than 300 words and a content similarity ratio higher than 90%.

Near-duplicate content query in Oncrawl

If these pages have few words and are identified as near duplicates, it is easy to fix them with a canonical or a noindex. A noindex remains the best solution for e-commerce or news websites that use standard descriptions.
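For illustration, here is a minimal Python sketch of that kind of filter applied to a crawl export. The file name and the word_count and similarity_ratio columns are hypothetical, not Oncrawl’s actual export schema; adapt them to your own data.

```python
import csv

# Hypothetical crawl export: one row per page, with a word count
# and a near-duplicate similarity ratio (0-100).
THIN_CONTENT_LIMIT = 300      # fewer than 300 words
SIMILARITY_THRESHOLD = 90     # more than 90% similar to another page

with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    pages = list(csv.DictReader(f))

to_fix = [
    p["url"]
    for p in pages
    if int(p["word_count"]) < THIN_CONTENT_LIMIT
    and float(p["similarity_ratio"]) > SIMILARITY_THRESHOLD
]

# These URLs are candidates for a canonical tag or a noindex directive.
for url in to_fix:
    print(url)
```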

2# Titles and H1 tags that are too long

Titles that are too long get truncated on mobile as well as on desktop. They can hurt your click-through rate in search results. A long title with too many keywords can also lower the impact of your primary keyword. In the following example, the website tries to rank on the “Shoes” query.

ranking on shoes query

To put it simply, Google measures the prominence of a keyword within a string of characters, taking into account the number of words and occurrences, to determine a weight for each word.
To rank for a product, the title needs to contain the brand and the product name, since Google identifies named entities. It is also better for your H1 to contain the product reference for better identification.
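As a purely illustrative toy, not Google’s actual scoring, the sketch below shows how padding a title with extra words dilutes the weight of the keyword you want to rank for:

```python
from collections import Counter

def keyword_weights(title: str) -> dict[str, float]:
    """Naive illustration: the weight of each word is its number of
    occurrences divided by the total number of words in the title."""
    words = title.lower().split()
    counts = Counter(words)
    total = len(words)
    return {word: count / total for word, count in counts.items()}

short_title = "Brand Shoes"
long_title = "Brand Shoes cheap shoes buy shoes online best price free shipping"

print(keyword_weights(short_title)["shoes"])  # 0.5: "shoes" carries half the short title
print(keyword_weights(long_title)["shoes"])   # ~0.27: diluted despite three occurrences
```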

3# An indexable page for each product variation

To avoid indexing every product variation, a doorway page system is the best solution. That doorway page will be the page that ranks for the product name. Letting Google index all the product variations can be penalizing. That’s why a noindex,follow directive and canonical URLs pointing to the doorway page are a good solution. From these pages, avoid linking to doorway pages that are not semantically related. It is important that these pages remain Follow, because they can still pass some link juice to semantically related pages.

4# Don’t focus on content segmentation

Think about your internal linking architecture: it tells you whether it serves your business goals. The main mistake would be to have page groups with different goals but the same page depth distribution. Your priority pages must be close to the homepage.

5# Don’t pay attention to usage metrics

Google is giving more and more importance to usage metrics. You need to banish:

  • Overly aggressive ads: think of the “Page Layout” algorithm that penalizes sites with too many ads above the fold;
  • Interstitials and popups on mobile and desktop: that penalty is related to Google’s new JavaScript interpretation.

It is thus crucial to monitor your usage metrics, like the bounce rate by device type and page group, to identify pages with issues. Time on site and the number of page views from SEO visits are key data to monitor. You can access them through a combined analysis of traffic and crawl data.
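For example, a minimal pandas sketch of that kind of breakdown could look like this. The file name and the url, device, page_group and bounced columns are hypothetical, not Oncrawl’s actual schema:

```python
import pandas as pd

# Hypothetical export of SEO visits: one row per session,
# with columns url, device, page_group, bounced (0/1).
visits = pd.read_csv("seo_visits.csv")

# Bounce rate by device type and page group, in percent.
bounce_rate = (
    visits.groupby(["device", "page_group"])["bounced"]
    .mean()
    .mul(100)
    .round(1)
    .rename("bounce_rate_%")
)

# Page groups with an unusually high bounce rate are the ones to investigate first.
print(bounce_rate.sort_values(ascending=False))
```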

6# Unnatural link building

Your link profile is also important because it needs to look natural. Tools like Majestic help you monitor your backlinks and check their quality. If you get between 100 and 600 referring domains on a highly competitive query, chances are it is not natural.
Unnatural linking identified via Majestic

The same goes for anchors: a page does not naturally receive only exact-match anchors for a specific product.

Link anchor quality via Majestic

Link quality and link topic are also important because they have a major impact on how a link is valued.

7# Being too focused on positions

Positions are now too uncertain to be the only KPI you consider. Google tries to offer a customized experience by analyzing your search context. Are you on mobile? Are you on desktop? What is your geolocation? For the same query, search results can be completely different depending on the context. It is important to analyze only broad trends in your positions, because it is hard to monitor them exhaustively and know your exact place.
Prefer other indicators such as the volume of SEO visits, but also the number of active pages. An active page is a page receiving SEO traffic. The main goal is then to increase the number of active pages.

Volume of pages receiving SEO traffic

By monitoring these indicators, you will have a better understanding of your website’s SEO health. If you only focus on highly competitive positions, you will miss opportunities with mid-tail pages that generate more SEO traffic.
These must be the indicators you work with in 2017 to measure your level of performance and communicate internally. You can access this data through your logs or through your Google Analytics data in Oncrawl.
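As an illustration, here is a minimal sketch of counting active pages by crossing a crawl export with SEO visits extracted from your logs or Analytics. File and column names are hypothetical; adapt them to your own exports.

```python
import csv

# Hypothetical export: one row per SEO (organic) visit with the landing URL.
with open("organic_visits.csv", newline="", encoding="utf-8") as f:
    visited_urls = {row["url"] for row in csv.DictReader(f)}

# Hypothetical crawl export: every indexable URL found in the site structure.
with open("crawled_urls.csv", newline="", encoding="utf-8") as f:
    crawled_urls = {row["url"] for row in csv.DictReader(f)}

active_pages = crawled_urls & visited_urls    # pages receiving SEO traffic
inactive_pages = crawled_urls - visited_urls  # pages receiving none

print(f"Active pages: {len(active_pages)} / {len(crawled_urls)}")
print(f"Active ratio: {len(active_pages) / len(crawled_urls):.1%}")
```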

8# Not migrating to HTTPS

You should expect a drop in traffic for websites that remain on HTTP. An SSL certificate signals your website’s quality to Google and protects your user experience. Migrating your website is also the perfect time to clean it up, because you are going to set up a lot of 301 redirects. Don’t hesitate to:

  • Noindex pages: if you have old redirections from previous updates, check in your logs whether Google keeps fetching them (see the sketch after this list). It can be useful to delete old redirects or to block them with robots.txt. Google should not keep following these redirections: if they are still there, Google interprets them and attributes the link juice to the old URLs. Unless they have interesting backlinks, stop accumulating redirections after each update.
  • Monitor your logs: for a 100-page website, an HTTPS migration can possibly be handled manually. For a site with thousands of pages and a respectable amount of traffic, it is hard to carry out an HTTPS migration without appropriate tools to monitor your logs.
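Here is a minimal sketch of checking whether Googlebot still fetches old redirected URLs, assuming a standard combined-format access log and a hypothetical list of legacy paths in old_redirects.txt. The user-agent match is simplified; a production check should also verify the bot by reverse DNS.

```python
import re
from collections import Counter

# Hypothetical list of legacy URL paths that are (or were) 301-redirected.
with open("old_redirects.txt", encoding="utf-8") as f:
    old_paths = {line.strip() for line in f if line.strip()}

hits = Counter()
# Combined log format: ... "GET /path HTTP/1.1" status ... "referer" "user-agent"
line_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[^"]*" (\d{3}).*"([^"]*)"$')

with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        m = line_re.search(line)
        if not m:
            continue
        path, status, user_agent = m.groups()
        if "Googlebot" in user_agent and path in old_paths:
            hits[(path, status)] += 1

# Old redirects that Google keeps fetching: candidates for cleanup.
for (path, status), count in hits.most_common(20):
    print(f"{count:6d}  {status}  {path}")
```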

Good practices for a successful HTTPS migration

  • Get a good understanding of all your URLs.

To do so, you need to crawl your website to get an exhaustive view of your pages and avoid missing any. By segmenting your URLs, you can distinguish the URLs you want to send juice to from the old URLs you want to remove, and thus redistribute your pages’ popularity.

  • Prepare your redirection plan by monitoring issues on orphan pages.

An orphan page is a page that does not belong to your website’s internal linking structure but that Google knows about. That information is accessible by combining your crawl and log data. If Google knows these pages, it keeps visiting them and thus wastes resources on pages that should not be your priority. They do not generate a lot of traffic and no links point to them. In this example, the website is not optimized and can’t rank on the expected queries because only 42% of the known pages belong to the structure, while 82,000 pages are orphans crawled by Google.

Orphan pages
Check whether these orphan pages generate traffic. If so, it can be smarter to redirect them rather than adding a noindex everywhere.
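A minimal sketch of how orphan pages can be isolated by combining the two data sources, assuming a hypothetical crawl export (URLs linked in the structure) and a log extract of URLs fetched by Googlebot:

```python
import csv

def url_set(path: str, column: str = "url") -> set[str]:
    """Read one column of a CSV export into a set of URLs."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column] for row in csv.DictReader(f)}

in_structure = url_set("crawled_urls.csv")       # pages linked in the site structure
known_by_google = url_set("googlebot_hits.csv")  # pages fetched by Googlebot (from logs)

orphans = known_by_google - in_structure

print(f"Orphan pages: {len(orphans)}")
print(f"Share of known pages in the structure: "
      f"{len(known_by_google & in_structure) / len(known_by_google):.0%}")

# Next step: cross the orphans with traffic data to decide between
# redirecting them and letting them drop out of the index.
```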

  • Crawl a staging version against the live version and vice versa to identify changes.

Have you improved your pages’ internal popularity? Has the number of pages changed between the staging and live versions? Oncrawl lets you compare crawls and identify shifts in your internal popularity distribution.
Comparing a production version with a staging version

  • Analyze the live version by monitoring behavior in the logs.

Take the time to review your status codes in real time. You can identify regressions, such as an increase in 302s or 404s.

Identifying a spike in 404s after a migration

That reactivity saves you from waiting for delayed alert messages from the Search Console. You can optimize your crawl frequency and the number of active pages.
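As an illustration, here is a minimal sketch of tallying status codes per day from a combined-format access log, so that a post-migration rise in 404s or 302s shows up immediately. The log path and format assumptions are illustrative.

```python
import re
from collections import Counter, defaultdict
from datetime import datetime

# Combined log format: IP - - [10/Feb/2017:06:25:14 +0100] "GET /page HTTP/1.1" 404 ...
line_re = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "[^"]*" (\d{3})')

per_day = defaultdict(Counter)
with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        m = line_re.search(line)
        if m:
            day = datetime.strptime(m.group(1), "%d/%b/%Y").date()
            per_day[day][m.group(2)] += 1

# Daily totals: a sudden jump in 404 or 302 counts points to a regression.
for day in sorted(per_day):
    counts = per_day[day]
    total = sum(counts.values())
    print(f"{day}  total={total:6d}  404={counts['404']:5d}  "
          f"302={counts['302']:5d}  301={counts['301']:5d}  200={counts['200']:6d}")
```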

9# Poor internal linking

Even if you thought your internal linking was on point, log analysis is the only way to understand how Google interprets it. For example, Google could allocate a significant share of its resources to a small set of useless pages, while pages generating revenue would be visited less. Check in the logs the impact of optimizations such as breadcrumb or navigation bar modifications.
These changes can sometimes increase your traffic by 30% or 50% in just a few weeks, simply because the crawl budget is allocated more cleverly.

10# Not identifying priority improvements

It can be hard to identify priority SEO tasks. It is often interesting to start with groups of active and inactive pages.

Active and inactive pages by page group

In fact, your brand, product and editorial pages don’t have the same goals or the same value. Some have conversion goals, others have a higher average basket than others, etc. Knowing your business goals and looking at your active and inactive pages will let you focus on the inactive pages that could generate value. Then you will be able to identify the next task.

Bonus

  • Think multi-device and mobile user experience;
  • Focus on payload: in a mobile environment, the payload will impact usage metrics;
  • Crawl your website with JavaScript to get a clear idea of the load time of your overall payload. This feature is available in the crawl settings;
  • Control your tags: Oncrawl Custom Fields let you scrape elements that belong to your pages, like Google Analytics tags. Available on demand.

Watch the replay

Emma Labrador
Emma was the Head of Communication & Marketing at Oncrawl for over seven years. She contributed articles about SEO and search engine updates.