While SEO is a long-term strategy, there are some quick wins you can implement to improve your rankings. Don’t get me wrong: some improvements are harder to make, and some changes will take time to be rewarded by Google. But as an SEO consultant, for instance, you sometimes need to show your clients that your work has a direct impact on their rankings. These eight strategies will certainly help you quickly improve your clients’ SEO performance.
Having internal duplicates can seriously lower your SEO performance: in a post-Panda world, letting your indexed pages compete against each other is not a good idea. Google has to pick the most relevant version of a piece of content, and this is not the search engine’s strongest skill. In the end, having too many duplicates can damage your “top pages”, as the link juice won’t be spread efficiently. In most cases, duplicate content is caused by several URLs serving the same content.
When several URLs serve the same content, search engines see each one as a separate page and treat the extras as duplicates. So you need to consolidate those pages with a canonical tag or a 301 redirect.
To identify your internal duplicates, you can use a crawler like OnCrawl to spot those issues.
Here you can see that you have duplicate content and that some of your canonicals are not set or do not match. By clicking on the area you want to examine, you will see precisely which URLs are affected.
Here you can access all the duplicate URLs with no matching canonical. If you click on a URL, you will see which page it duplicates and their content similarity.
You also have the option to identify your internal duplicates by clusters.
And then, click on a cluster of “canonicals not set” and you will immediately know which URLs are duplicated.
Then, you will be able to add the right canonicals to avoid Google penalties.
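To make the idea concrete, here is a minimal Python sketch of how duplicate URL variants (protocol, trailing slash, tracking parameters) can be collapsed into one canonical URL and turned into the tag to place in the page `<head>`. The URLs and normalization rules are illustrative assumptions, not a complete canonicalization policy.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Normalize common URL variants (scheme, case, trailing slash,
    query parameters) to a single canonical form."""
    parts = urlsplit(url.lower())
    path = parts.path.rstrip("/") or "/"
    # Drop query strings such as ?utm_source=... that create duplicate URLs.
    return urlunsplit(("https", parts.netloc, path, "", ""))

def canonical_tag(url):
    """Build the <link rel="canonical"> tag for the page <head>."""
    return '<link rel="canonical" href="%s" />' % canonical_url(url)

# Three variants of the same hypothetical product page...
variants = [
    "http://example.com/product?utm_source=newsletter",
    "https://example.com/product/",
    "https://EXAMPLE.com/product",
]
# ...all collapse to a single canonical tag.
print({canonical_tag(v) for v in variants})
```

Real sites often need extra rules (parameter whitelists, pagination, language variants), so treat this as a starting point rather than a drop-in solution.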
If you want to rank your content for a specific keyword, don’t be vague: get straight to the point. Some websites still use elusive anchor texts to link to other pages, which is unclear to both search engines and visitors. Be specific and informative while avoiding keyword stuffing strategies.
The title tag still has a significant impact on your SEO and is still underestimated by some marketers. To be efficient and help improve your rankings, title tags need to be unique, keyword-targeted, simple and descriptive. To find out whether you have any duplicate or missing titles, you can also use OnCrawl.
Long titles not only turn off visitors (who won’t see the full title in the SERPs) but also dilute the SEO impact of your keywords. Dr. Peter J. Meyers from the Moz blog shared this interesting example:
The most common culprit I see is when someone adds their home-page TITLE to the end of every other page. Let’s say your home-page TITLE is:
“The Best Bacon Since 1983 | Bob’s Bacon Barn”
Then, for every product page, you have something like this:
“50-pound Mega-sack of Bacon | The Best Bacon Since 1983 | Bob’s Bacon Barn”
It may not look excessive, but you’re diluting the first few (and most important) keywords for the page, and you’re making every page on the site compete with your home-page unnecessarily. It’s fine to use your company name (or a shortened version, like “Bob’s Bacon”) at the end of all of your TITLE tags, but don’t repeat core keywords on a massive scale. I’ve seen this go to extreme, once you factor in long product names, categories, and sub-categories.
Your titles should stay around 50 to 60 characters to display fully in the SERPs. Again, if you click on the area that interests you, you can see where to look for title optimizations. Here, for instance, you can quickly see that 24,456 pages have titles that are too long.
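A crawl export makes this kind of title audit easy to script. The sketch below, using hypothetical (url, title) pairs and an assumed 60-character display limit, flags titles that are too long and titles reused across pages:

```python
from collections import Counter

# Hypothetical crawl export: (url, title) pairs.
pages = [
    ("/bacon/mega-sack", "50-pound Mega-sack of Bacon | Bob's Bacon"),
    ("/bacon/sampler",   "Bacon Sampler Pack | Bob's Bacon"),
    ("/about",           "Bacon Sampler Pack | Bob's Bacon"),  # accidental duplicate
]

MAX_LEN = 60  # rough SERP display limit, in characters

too_long = [url for url, title in pages if len(title) > MAX_LEN]
counts = Counter(title for _, title in pages)
duplicates = [url for url, title in pages if counts[title] > 1]

print("Too long:", too_long)
print("Duplicated:", duplicates)
```

The character limit is an approximation: Google actually truncates titles by pixel width, so treat any fixed count as a rule of thumb.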
The same goes for meta descriptions: they should not be neglected. While they do not directly impact your rankings, they make your content more appealing in the SERPs and help increase your click-through rate and your traffic.
That is why you should not let Google automatically generate your meta descriptions. They should be unique, descriptive, keyword-targeted and appealing.
Here again, OnCrawl can deliver insights about your meta description performance.
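The same audit approach applies to meta descriptions. This sketch, over a hypothetical crawl export and assumed display limits of roughly 70 to 160 characters, flags descriptions that are missing, too short, or too long:

```python
# Hypothetical crawl export: url -> meta description ("" when not set).
descriptions = {
    "/bacon/mega-sack": "Order a 50-pound mega-sack of hickory-smoked bacon, "
                        "cured in-house since 1983. Free shipping on bulk orders.",
    "/bacon/sampler": "",
    "/about": "About us.",
}

MIN_LEN, MAX_LEN = 70, 160  # rough SERP display range, in characters

missing = [u for u, d in descriptions.items() if not d]
too_short = [u for u, d in descriptions.items() if d and len(d) < MIN_LEN]
too_long = [u for u, d in descriptions.items() if len(d) > MAX_LEN]

print("Missing:", missing)
print("Too short:", too_short)
print("Too long:", too_long)
```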
If you own a large website with a high number of pages, you cannot let your best pages sink deep into the architecture. Page depth should not exceed 3 clicks, but in many cases it does. If the path is clear to crawlers and visitors, you won’t be penalized, but you could miss some nice opportunities to spread the link juice properly. What you can do is link your best pages directly from the homepage. This strategy spreads the link juice more effectively, as your best pages will sit on the first level. For instance, you can create a section on your homepage called “Featured Products”.
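Page depth is just the number of clicks from the homepage, which a breadth-first search over the internal link graph computes directly. In this sketch the link graph is invented for illustration; note how the featured product linked from the homepage ends up at depth 1 while an unlinked product sits 3 clicks away:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
links = {
    "/": ["/category", "/featured-product"],
    "/category": ["/subcategory"],
    "/subcategory": ["/deep-product"],
    "/featured-product": [],
    "/deep-product": [],
}

def page_depths(graph, start="/"):
    """Breadth-first search from the homepage: depth = clicks from start."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(page_depths(links))
```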
Adding rich snippets to your content can boost your visibility in search engine results and your click-through rate. They also help lower your bounce rate and grow user engagement. Those little elements can thus make the difference in the SERPs. You can check your structured data distribution and details, and analyze them by page depth, with OnCrawl.
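Rich snippets are typically fed by structured data such as JSON-LD markup embedded in the page. As a sketch, the Python below assembles a schema.org `Product` block (all values here are invented for illustration) and wraps it in the script tag you would place in the page:

```python
import json

# Hypothetical Product markup; names, ratings and prices are illustrative only.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "50-pound Mega-sack of Bacon",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "127",
    },
    "offers": {
        "@type": "Offer",
        "price": "199.00",
        "priceCurrency": "USD",
    },
}

snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(product, indent=2)
print(snippet)
```

In practice you would generate this server-side from your product database and validate it with Google’s structured data testing tools before relying on it.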
404 errors are easy to avoid: run a crawl of your website and you will easily spot them. They lower your site’s quality with regard to Google’s guidelines and negatively impact your user experience. You should also set up a customized 404 page to effectively redirect your visitors.
OnCrawl can help you detect those pages at a glance and get rid of them.
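If you just need a quick check on a known list of URLs, a few lines of Python will do. This sketch accepts an injected `fetch` function so it can be tested without network access; by default it makes real HTTP requests with the standard library:

```python
from urllib.request import urlopen
from urllib.error import HTTPError

def broken_urls(urls, fetch=None):
    """Return the URLs that answer with a 404 status."""
    def default_fetch(url):
        try:
            return urlopen(url).getcode()
        except HTTPError as err:
            return err.code
    fetch = fetch or default_fetch
    return [u for u in urls if fetch(u) == 404]

# A stub fetcher stands in for real HTTP requests in this example.
statuses = {"/ok": 200, "/gone": 404, "/moved": 301}
print(broken_urls(statuses, fetch=statuses.get))
```

For a full site you would feed this the URL list from your crawl rather than a hand-written dictionary, and add rate limiting.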
I would love to hear about other on-page SEO quick wins, so if you feel like sharing some, please do so in the comments section. Also, we still offer a 30-day trial of OnCrawl. No strings attached!