28th Sep 2016
The SEO industry has experienced a dramatic rate of change in best practice since early in 2011 due in no small part to changes made by Google to its algorithms. Not since the rise of Google in 1999 has the term “search engine optimisation” seen such a transformation in what it means in practice.
With the release of Google Panda in February 2011 (has it really been that long?), on-site SEO became far more than worrying about title tags, keyword relevance, headings and link architecture. It’s now about delivering a great user experience, minimising redundant pages, streamlining the user journey and producing engaging, useful content. Those SEO agencies which weren’t already concerned with usability and user experience optimisation have been forced to adapt and evolve. That’s a good thing.
When Google Penguin appeared in April 2012, the practice of link building changed forever. Not only do low quality links provide less and less value to a website, they are increasingly likely to damage the rankings of even the largest, most trusted websites. Within the space of a few months, online link directories, press release ‘syndication’ sites, free-for-all article hubs and blog commenting flipped from being the mainstay of cheap and cheerful “SEO packages” worldwide to being a toxic issue that spawned a whole host of new industry jargon, from link cleanup to disavows and “Penguin audits”. Link building has become content marketing, quality is everything and engagement is king.
With the exception of a minority of websites who have been unfairly penalised by these changes, the net effect has been overwhelmingly positive for an industry which has struggled with a poor reputation, a vast number of SEO quacks offering miracle results using tactics that are little better than email spam, and a disconnect between “what’s best for the user” and “what’s best for the search engine”.
The latest change by Google, however, has no benefit for the industry, arguably no real benefit for users, and frankly seems to be a thinly veiled attempt to make more money.
During October 2011, Google announced that when people searched using the secure (https://) version of its search engine and clicked on a result, the actual search phrase used would no longer be provided to the destination website. In every tracking and analytics package, these visits show up grouped under the term “(not provided)”. The stated purpose of this change was privacy; however, there has been no change to clicks from paid search (PPC) listings, which still provide exact keyword data for all searches, whether secure or not. It’s not surprising, then, that many within the SEO community are cynical about this change, suggesting it’s a way of persuading businesses to invest more in AdWords in order to access keyword-level performance data.
Google has made several changes to increase the number of people using the secure version of its search engine, including making secure search the default for all users of its Chrome web browser. However, over the last month there has been a rapid growth in the percentage of visits appearing as (not provided) and Google now admits that it plans to make 100% of search queries secure (and thus “not provided”) in the coming months.
We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….
The motivation here is not to drive the ads side — it’s for our search users.
The Twitter account NPCount tracks the average percentage of (not provided) search traffic across 60 websites, and the increase is clear. Nearly 80% of traffic is now showing as (not provided), which makes keyword-level performance analysis statistically meaningless.
Our own analysis of (not provided) for a range of client websites suggests this change is rolling out globally rather than just in certain markets.
Across eight client sites, the proportion of organic visits showing as (not provided) was 59%, 53%, 53%, 58%, 64%, 63%, 57% and 64% respectively.
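To make the figures above concrete, here is a minimal sketch of how the (not provided) share might be calculated from a keyword report exported from an analytics package. The keyword names and visit counts are invented for illustration.

```python
# Hypothetical sketch: calculating the (not provided) share of organic
# visits from a keyword report. All figures below are invented.
visits_by_keyword = {
    "(not provided)": 5900,
    "blue widgets": 1200,
    "buy blue widgets online": 800,
    "widget reviews": 2100,
}

total_visits = sum(visits_by_keyword.values())
not_provided_share = visits_by_keyword["(not provided)"] / total_visits

print(f"(not provided) share: {not_provided_share:.0%}")
```

As that share approaches 100%, the remaining keyword rows describe an ever smaller, less representative slice of traffic, which is why per-keyword reporting stops being meaningful.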
Firstly, it’s important to state that organic search marketing is not “dead”, nor is it now impossible to measure. It’s just harder. The search volume hasn’t changed, user behaviour hasn’t changed, and the commercial value of ranking highly in organic search results for high volume, relevant search terms has not disappeared overnight.
What this change does make harder is the ability to analyse and report on performance of individual organic search keywords. It will soon not be possible to report on exactly how many visitors a specific search ranking delivered to a website. In many ways this is not critical. For quite some time organic search marketing has been about holistically improving the performance of an entire site and ‘keyword themes’ rather than specific keywords… especially due to the fact that focus on individual keywords with anchor text optimised links is a high-risk strategy in a post-Penguin world. Reputable search agencies should now be reporting and measuring themselves on overall organic search traffic trends, but more importantly on the actual number of relevant, valuable conversions their campaigns are delivering.
That’s where this change really hurts. It is no longer possible to analyse at the keyword level how organic traffic performs on your website – to see the exact conversion rate and overall value of each search term. That potentially makes proving the value of organic search difficult for in-house teams and agencies alike.
It’s useful to mention a few areas where keyword-level analysis is still possible, and to which agencies will increasingly turn in order to model organic search performance.
Ultimately, organic search performance analysis will follow two paths: overall organic search performance analysis to track the impact of all activity, and page-level analysis to inform keyword and conversion decisions.
It’s relatively simple to measure the overall quantity of organic search traffic and see the total impact this has on conversions and revenue, and the change to keywords showing as (not provided) will not affect this. Smart agencies are already comparing organic search traffic figures with Google Trends and other historic search volume data to indicate whether a client is out-performing or under-performing the market, and this will remain an important part of performance analysis.
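The market-comparison idea above can be sketched very simply: rebase both the site’s organic traffic and a market-demand series (such as Google Trends interest values) to a common index, then compare their growth. All figures here are invented for illustration.

```python
# Hypothetical sketch: indexing monthly organic traffic against market
# search interest (e.g. Google Trends values) to see whether a site is
# out-performing or merely riding seasonal demand. Figures are invented.

def to_index(series):
    """Rebase a series so that the first period equals 100."""
    base = series[0]
    return [round(100 * value / base, 1) for value in series]

site_visits = [12000, 12600, 13800, 15000]  # monthly organic sessions
market_interest = [80, 82, 84, 85]          # market demand (Trends-style)

site_index = to_index(site_visits)
market_index = to_index(market_interest)

# A site index growing faster than the market index suggests the site is
# gaining share, not simply benefiting from rising search demand.
for month, (s, m) in enumerate(zip(site_index, market_index), start=1):
    status = "out-performing market" if s > m else "tracking market"
    print(f"Month {month}: site {s} vs market {m} -> {status}")
```

The point of rebasing both series to 100 is that absolute traffic and Trends values are on different scales; only their relative growth is comparable.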
One key area where external activity (both online and offline) can have an impact on organic search is branded search volume – the number of people who are searching for you by name. The move to (not provided) will make this much harder to track, but using external data such as historic brand search volumes from the Google AdWords Keyword Planner and brand PPC activity will allow agencies to continue giving a good indication of brand vs non-brand traffic trends.
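One way to apply this in practice is a back-of-the-envelope split: take an external estimate of monthly brand search volume, assume a capture rate for how many of those searches the site actually wins, and subtract that from total organic visits. Both the figures and the capture-rate assumption below are invented for illustration; this is an estimate, not a measurement.

```python
# Hypothetical sketch: estimating the brand vs non-brand split of organic
# traffic once keyword data is (not provided). The brand search volume
# would come from an external source such as the AdWords Keyword Planner;
# the capture rate is an explicit assumption. All figures are invented.

monthly_organic_visits = 40000   # total organic sessions, from analytics
brand_search_volume = 18000      # estimated monthly brand searches (external data)
brand_capture_rate = 0.70        # ASSUMPTION: share of brand searches the site wins

estimated_brand_visits = brand_search_volume * brand_capture_rate
estimated_non_brand_visits = monthly_organic_visits - estimated_brand_visits

print(f"Estimated brand visits: {estimated_brand_visits:.0f}")
print(f"Estimated non-brand visits: {estimated_non_brand_visits:.0f}")
```

The capture rate can be sanity-checked against brand PPC data, where exact query volumes are still reported, which is precisely why brand PPC activity remains useful for this kind of modelling.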
Rather than fixate on keyword-level data that is no longer available, there will likely be a move towards analysing page-level metrics in order to assess the effectiveness of organic search efforts by both brands and their agencies.
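In practice, page-level analysis means treating each organic landing page as a proxy for the keyword theme it targets, then reporting traffic and conversion rate per page rather than per keyword. A minimal sketch, with invented page names, themes and figures:

```python
# Hypothetical sketch: page-level analysis as a proxy for keyword-level
# reporting. Each organic landing page is mapped to the keyword theme it
# was optimised for, and conversion rate is computed per page. All data
# below is invented for illustration.

landing_pages = [
    {"page": "/blue-widgets/", "theme": "blue widgets", "visits": 3200, "conversions": 96},
    {"page": "/widget-reviews/", "theme": "widget reviews", "visits": 5400, "conversions": 54},
    {"page": "/", "theme": "brand", "visits": 9000, "conversions": 450},
]

conversion_rates = []
for row in landing_pages:
    rate = row["conversions"] / row["visits"]
    conversion_rates.append(rate)
    print(f'{row["page"]} ({row["theme"]}): {row["visits"]} visits, {rate:.1%} conversion rate')
```

Because landing-page data is unaffected by (not provided), this view survives the change intact, and it ties reporting directly to the conversions each themed page delivers.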
In summary, this latest change by Google feels frustrating and cynical at first glance. Whether it is a genuine response to users’ increasing need for privacy in the face of PRISM or a cheap attempt at increasing AdWords revenue, a lot of people in the SEO community are angry and concerned by this latest update.
For those agencies flexible and switched-on enough to adapt their performance analysis accordingly, though, this represents an opportunity to further align SEO analysis with real business objectives, to better measure the wide-ranging positive impact of organic search marketing beyond “traffic from target keywords”, and to help differentiate worthwhile agencies from the many poor agencies and “SEO consultants” who still plague the industry.