2016 has been a turbulent year for SEO, and has arguably seen the most substantial changes in a number of years.
We saw major changes to algorithms affecting mobile and local search being rolled out from various search engines, as well as the long awaited update to Google’s Penguin algorithm.
One big trend in 2016, however, was voice search, with more and more searchers adopting it. This has been driven by advancements in personal assistant technology (Siri, Cortana, Alexa) and more recent developments such as Google Home, the Amazon Echo and Mayfield Robotics’ Kuri, which was design-led by a Pixar animator (I’ve included one of their short product YouTube videos at the end of the post).
It’s expected that in 2017 the number of smartphone users globally will exceed 2 billion, and reach nearer 3 billion by 2020. In May 2016, it was estimated that 1 in 5 searches made on a mobile device in the USA originated from voice search.
As digital consumers we’ve also embraced the internet of things, with our internet enabled wearables, household appliances and entertainment systems.
Voice search began in 2011, when Apple unveiled Siri on the iPhone 4S, and since then more virtual search assistants have arisen. By June 2015, Siri was handling more than 1 billion voice queries per week.
Google have now integrated voice search into the search bar in version 46 of Google Chrome, and some modern laptops have voice search integrated into their operating systems (for finding local files).
The need for a keyboard is becoming less of a factor and search engines are evolving with this trend.
A 2014 Google study found that 41% of adults and 55% of teenagers perform at least one voice search per day.
Google’s study showed that when we are on the move, such as commuting, or busy multitasking we are more likely to use voice search over traditional typed search. The study also showed that Bluetooth and internet enabled cars are driving the trend, with users using voice search for queries related to their destination, or journey.
During the Google I/O 2016 keynote, Google revealed that roughly 20% of searches on Android devices and via the Google Search app are initiated by voice. The keynote also saw Google introduce Google Assistant to the market:
> The assistant is an ambient experience that will work seamlessly across devices and contexts. So you can summon Google’s help no matter where you are or what the context. It builds on all our years of investment in deeply understanding users’ questions.
Google is going to increasingly rely on Knowledge Graph and Rank Brain to both answer, and provide better results for voice initiated searches. This means that the way we view content needs to change.
The days of optimizing content purely for keywords are long gone. Content in today’s landscape needs to show expertise and authority, as well as score highly on readability tests.
Part of this should include adding structured data markup, schemas and microformat data to your content, making it easier for search engines to understand not only the content itself, but also its context.
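As a concrete illustration, structured data is commonly embedded as schema.org JSON-LD inside a `<script>` tag. Below is a minimal sketch in Python that builds an `FAQPage` block and prints the tag you would place in your page; the question and answer text are hypothetical examples, not from this article.

```python
import json

# A minimal sketch of schema.org FAQPage structured data.
# The question/answer content here is a made-up example.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is voice search?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Voice search lets users query a search engine "
                        "by speaking instead of typing.",
            },
        }
    ],
}

# Serialise to JSON-LD and wrap it in the script tag search engines expect.
json_ld = json.dumps(faq_markup, indent=2)
script_tag = f'<script type="application/ld+json">\n{json_ld}\n</script>'
print(script_tag)
```

The resulting tag can sit in the page’s `<head>` or `<body>`; question-and-answer markup like this pairs naturally with the longer, question-based queries that voice search produces.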
Data suggests that the average search query length within Google is 4.29 words, however voice initiated search queries take on a much longer form.
Content should still hold commercial value, but its informational value should have a stronger focus. You should include more terms such as who, what and where, answering questions while also catering for longer search queries.
In the early days of search engines, people tended to use what we now call long tail queries. Over time, queries have become shorter, then build back out into longer tail queries as users reach different stages of their ‘research’.
Voice search starts off as a long tail query, because that’s how people speak. In order to optimise for this, content needs to be optimised for humans (as it already should be). So right now, your content may be targeting a small set of keywords like:
Whereas you should be looking to optimise for long tail and question-based searches, such as:
It could be argued that Google have been preparing for voice search since 2013, when they released the Hummingbird algorithm. This was evidence that Google were switching focus to semantic language and to determining user intent.
This was also somewhat reaffirmed by Paul Haahr at SMX West 2016, where he gave insight into the work of a Google search evaluator, and how evaluators determine whether a set of search results ‘highly meets’ a user’s needs and intent.
As promised, here is a short YouTube clip of the Kuri home assistant – before you decide you want one, they cost $799, and aren’t on the market yet…