x10 Internal links to pages for top searches
101 000 Daily organic visits (2016)
295 000 Daily organic visits (2018)
Paris Match is a French weekly magazine covering the latest news in words and images, founded in 1949 and famous for its motto: “The weight of words, the shock of pictures”.
The brand faced a clear challenge: conduct a thorough audit of its site, identify its strengths and weaknesses, determine its priorities, and fix the factors blocking Google’s crawl:
- Dealing with a large volume of URLs;
- Identifying structural issues (redirects, redesign, site tree, etc.);
- Determining a consistent content strategy in accordance with the editorial line;
- Tackling duplicate content.
By implementing log file analysis with OnCrawl, Julien Ferras and his team were able to increase their traffic from Google by more than 80% thanks to:
- The identification of pages with empty or duplicated title and/or meta description tags, and quickly implemented corrective actions;
- The detection of numerous 301 redirects and 404 errors;
- The identification of pages that didn’t generate traffic but were still crawled;
- The redesign of the website’s tree structure and the definition of an efficient internal linking strategy;
- The verification of the optimizations’ impact by combining log files and Search Console data;
- An understanding of the impact of word count and number of links on the traffic generated.
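The crawl-budget diagnosis listed above — spotting pages that Googlebot keeps crawling even though they bring no visits — comes down to cross-referencing server logs with a list of URLs that actually receive organic traffic. The sketch below illustrates the principle on invented Apache combined-format log lines and an invented set of organic URLs; OnCrawl automates this at scale, but the logic is the same.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache combined log format (paths and dates invented).
SAMPLE_LOGS = [
    '66.249.66.1 - - [10/Mar/2018:10:00:00 +0100] "GET /actu/page-a HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Mar/2018:10:01:00 +0100] "GET /actu/page-b HTTP/1.1" 404 312 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/Mar/2018:10:02:00 +0100] "GET /actu/page-a HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

# Extract method, path, and status code from the quoted request part of each line.
LOG_RE = re.compile(r'"(?P<method>\w+) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_hits(lines):
    """Count Googlebot crawl hits per URL path and record the status codes seen."""
    hits, statuses = Counter(), {}
    for line in lines:
        if "Googlebot" not in line:     # naive bot filter; real pipelines also verify the IP
            continue
        m = LOG_RE.search(line)
        if m:
            hits[m["path"]] += 1
            statuses.setdefault(m["path"], set()).add(m["status"])
    return hits, statuses

# URLs that received organic visits (e.g. exported from Search Console) -- invented here.
organic_urls = {"/actu/page-a"}

hits, statuses = googlebot_hits(SAMPLE_LOGS)
crawled_no_traffic = [u for u in hits if u not in organic_urls]
print(crawled_no_traffic)  # pages consuming crawl budget without returning visits
```

On this toy data, `/actu/page-b` surfaces as a page that wastes crawl budget (and returns a 404, so it is also a candidate for the error clean-up mentioned above).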
Today, Paris Match records more than 17 million visits and 100 million page views each month. This use case introduces the methodology used to improve Paris Match’s visibility and the best practices that led to these results.
The Use Case
- About Paris Match
- About Julien Ferras
- Identifying blocking factors for the GoogleBot
- The Notion of Crawl Budget
- Implementing a method
- Segmenting the content
- Digging into the data and understanding the bot behavior
- Combining data to find the right values
- Determining a specific action plan
- Identifying a clear action plan
- Identifying “poor” pages
- Improving the quality of Google’s crawl
- Improving the load time
- Optimizing internal popularity with InRank
- Moving pages up in the site tree
- Increasing the content density and UX
- About OnCrawl